Compare commits


628 Commits

Author SHA1 Message Date
D. Berge
4c2a2617a1 Adapt Project component to Vuex use for fetching data.
The Project component is now responsible for fetching and
updating the data used by most project tabs, with the
exception of ProjectSummary, QC, Graphs and Map. It is
also the only one listening for server events and reacting
to them.

Individual tabs are still responsible for sending data to
the server, at least for the time being.
2023-10-25 16:19:18 +02:00
D. Berge
5021888d03 Adapt Log component to Vuex use for fetching data 2023-10-25 16:18:41 +02:00
D. Berge
bf633f7fdf Refactor Calendar component.
- adapts it to Vuex use for fetching data
- displays extra events in 4-day and day views
- allows classifying by event label in 4-day and day views
2023-10-25 16:16:01 +02:00
D. Berge
847f49ad7c Adapt SequenceList component to Vuex use for fetching data 2023-10-25 16:15:17 +02:00
D. Berge
171feb9dd2 Adapt Plan component to Vuex use for fetching data 2023-10-25 16:14:45 +02:00
D. Berge
503a0de12f Adapt LineList component to Vuex use for fetching data 2023-10-25 16:13:56 +02:00
D. Berge
cf89a43f64 Add project configuration to Vuex store 2023-10-25 16:11:24 +02:00
D. Berge
680e376ed1 Add Vuex sequence module 2023-10-25 16:11:24 +02:00
D. Berge
a26974670a Add Vuex plan module 2023-10-25 16:11:24 +02:00
D. Berge
16a6cb59dc Add Vuex line module 2023-10-25 16:11:24 +02:00
D. Berge
829e206831 Add Vuex label module 2023-10-25 09:59:04 +02:00
D. Berge
83244fcd1a Add Vuex event module 2023-10-25 09:51:28 +02:00
D. Berge
851369a0b4 Invalidate planner endpoint cache when setting remarks 2023-10-23 14:58:41 +02:00
D. Berge
5065d62443 Update planner endpoint documentation 2023-10-23 14:57:27 +02:00
D. Berge
2d1e1e9532 Modify return payload of planner endpoint.
Previous:

[
  { sequence: …},
  { sequence: …},
  …
]

Current:

{
  remarks: "…",
  sequences: [
    { sequence: …},
    { sequence: …},
    …
  ]
}
2023-10-23 14:53:32 +02:00
D. Berge
051049581a Merge branch '278-rewrite-events-queue' into 'devel'
Resolve "Rewrite events queue"

Closes #278

See merge request wgp/dougal/software!46
2023-10-17 10:28:21 +00:00
D. Berge
da5ae18b0b Merge branch '269-support-requesting-a-partial-update-from-the-events-log-endpoint' into devel 2023-10-17 12:27:31 +02:00
D. Berge
ac9353c101 Add database upgrade file 31. 2023-10-17 12:27:06 +02:00
D. Berge
c4c5c44bf1 Add comment 2023-10-17 12:20:19 +02:00
D. Berge
d3659ebf02 Merge branch '269-support-requesting-a-partial-update-from-the-events-log-endpoint' into 'devel'
Resolve "Support requesting a partial update from the events log endpoint"

Closes #269

See merge request wgp/dougal/software!47
2023-10-17 10:18:41 +00:00
D. Berge
6b5070e634 Add event changes API endpoint description 2023-10-17 12:15:41 +02:00
D. Berge
09ff96ceee Add events change API endpoint 2023-10-17 11:15:36 +02:00
D. Berge
f231acf109 Add events change middleware 2023-10-17 11:15:06 +02:00
D. Berge
e576e1662c Add library function returning event changes after given epoch 2023-10-17 11:13:58 +02:00
D. Berge
6a21ddd1cd Rewrite events listener and handlers.
The events listener now uses a proper self-consuming queue and
the event handlers have been rewritten accordingly.

The way this works is that running init() on the handlers
library instantiates the handlers and returns two higher-order
functions, prepare() and despatch(). A call to the latter of
these is appended to the queue with each new incoming event.

The handlers have access to a context object (ctx) which may be
used to persist data between calls and/or exchange data between
handlers. This is used notably to give the handlers access to
project configurations, which are themselves refreshed by a
project configuration change handler (DetectProjectConfigurationChange).
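
A minimal sketch of the resulting wiring (listener, queue and the
handler internals are illustrative; only init(), prepare(), despatch(),
ctx and DetectProjectConfigurationChange come from this change):

const { prepare, despatch } = init();
await prepare();
// Append a despatch() call to the queue for each new incoming event:
listener.on("notification", ev => queue.push(() => despatch(ev)));

// Handlers receive the shared context object and may refresh it:
async function DetectProjectConfigurationChange (ev, ctx) {
  if (ev.channel === "projects") ctx.configurations = await loadConfigurations();
}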
2023-10-14 20:53:42 +02:00
D. Berge
c1e35b2459 Cache project configuration details.
This avoids requesting the project configurations on every single
incoming message. A listener refreshes the data on configuration
changes.
2023-10-14 20:11:18 +02:00
D. Berge
eee2a96029 Modify logging statements 2023-10-14 20:10:46 +02:00
D. Berge
6f5e5a4d20 Fix bug for shortcut when there is only one candidate project 2023-10-14 20:09:07 +02:00
D. Berge
9e73cb7e00 Clean up on SIGINT, SIGHUP signals 2023-10-14 20:07:19 +02:00
D. Berge
d7ab4eec7c Run some tasks periodically from the main process.
This reduces reliance on crontab jobs.
2023-10-14 20:06:38 +02:00
D. Berge
cdd96a4bc7 Don't bother trying to kill the child process on exit.
The exit signal handler does not allow asynchronous tasks and,
in any case, killing the parent should kill its children too.
2023-10-14 20:02:54 +02:00
D. Berge
39a21766b6 Exit on start up errors 2023-10-14 20:02:04 +02:00
D. Berge
0e33c18b5c Replace console.log() with debug library calls 2023-10-14 19:57:57 +02:00
D. Berge
7f411ac7dd Add queue libraries.
A basic queue implementation and one that consumes its items
automatically until empty.
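
The self-consuming variant works roughly like this (a sketch, not the
actual implementation):

class AutoQueue {
  constructor () { this.items = []; this.running = false; }
  push (fn) { this.items.push(fn); this.drain(); }
  async drain () {
    if (this.running) return;
    this.running = true;
    while (this.items.length) await this.items.shift()();  // consume until empty
    this.running = false;
  }
}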
2023-10-14 19:56:56 +02:00
D. Berge
ed1da11c9d Add helper function to purge notifications 2023-10-14 19:54:34 +02:00
D. Berge
66ec28dd83 Refactor DB notifications listener to support large payloads.
The listener will automatically retrieve the full payload
before passing it on to event handlers.
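
Sketch of the approach, assuming the NOTIFY payload carries only a
reference (PostgreSQL limits NOTIFY payloads to about 8000 bytes)
which is resolved against a notifications table:

subscriber.notifications.on(channel, async (note) => {
  const { rows } = await db.query(
    "SELECT payload FROM notifications WHERE id = $1", [note.id]);
  await despatch(rows[0].payload);  // handlers see the full payload
});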
2023-10-14 18:33:41 +02:00
D. Berge
b928d96774 Add database upgrade file 30. 2023-10-14 18:29:28 +02:00
D. Berge
73335f9c1e Merge branch '136-add-line-change-time-log-pseudoevent' into 'devel'
Resolve "Add line change time log pseudoevent"

Closes #136

See merge request wgp/dougal/software!45
2023-10-04 12:50:49 +00:00
D. Berge
7b6b81dbc5 Add more debugging statements 2023-10-04 14:50:12 +02:00
D. Berge
2e11c574c2 Throw rather than return.
Otherwise the finally {} block won't run.
2023-10-04 14:49:35 +02:00
D. Berge
d07565807c Do not retry immediately 2023-10-04 14:49:09 +02:00
D. Berge
6eccbf215a There should be no need to await.
That is because the queue handler will, in theory, only ever
process one event at a time.
2023-09-30 21:29:15 +02:00
D. Berge
8abc05f04e Remove dead code 2023-09-30 21:29:15 +02:00
D. Berge
8f587467f9 Add comment 2023-09-30 21:29:15 +02:00
D. Berge
3d7a91c7ff Rewrite ReportLineChangeTime 2023-09-30 21:29:15 +02:00
D. Berge
3fd408074c Support passing array in opts.sequences to event.list() 2023-09-30 21:29:15 +02:00
D. Berge
f71cbd8f51 Add unique utility function 2023-09-30 21:29:15 +02:00
D. Berge
915df8ac16 Add handler for creation of line change time events 2023-09-30 21:29:15 +02:00
D. Berge
d5ecb08a2d Allow switching to event entry by time.
A ‘Timed’ button is shown when a new (not edited) event is in
the event entry dialogue and the event has sequence and/or
point values. Pressing the button deletes the sequence/point
information and sets the date and time fields to current time.

Fixes #277.
2023-09-30 21:26:32 +02:00
D. Berge
9388cd4861 Make daily_tasks work with new project configuration 2023-09-30 20:36:46 +02:00
D. Berge
180590b411 Mark events as being automatically generated 2023-09-30 01:42:27 +02:00
D. Berge
4ec37539bf Add utils to work with Postgres ranges 2023-09-30 01:41:45 +02:00
D. Berge
8755fe01b6 Refactor events.list.
The SQL has been simplified and the following changes made:

- The `sequence` argument now can only take one individual
  sequence, not a list of sequences.
- A new `sequences` argument is recognised. It takes a list
  of sequences (as a string).
- A new `label` argument is recognised. It takes a label
  name and returns events containing that label.
- A new `jpq` argument is recognised. It takes a JSONPath
  string which is applied to `meta` with jsonb_path_exists(),
  returning any events for which the JSON path expression
  matches.
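
An illustrative call (the opts-object signature is an assumption):

const events = await event.list(schema, {
  sequences: "11,12,13",       // list of sequences, as a string
  label: "QC",                 // only events carrying this label
  jpq: "$.qc_id ? (@ == 42)"   // JSONPath matched against `meta`
});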
2023-09-30 01:37:22 +02:00
D. Berge
0bfe54e0c2 Include the meta attribute when posting events 2023-09-30 01:36:18 +02:00
D. Berge
29bc689b84 Merge branch '276-add-soft-start-event-detection' into 'devel'
Resolve "Add soft start event detection"

Closes #276

See merge request wgp/dougal/software!44
2023-09-29 15:02:57 +00:00
D. Berge
65682febc7 Add soft start and full volume events detection 2023-09-29 17:02:03 +02:00
D. Berge
d408665d62 Write meta info to automatic events 2023-09-29 16:49:27 +02:00
D. Berge
64fceb0a01 Merge branch '127-sol-eol-events-not-being-inserted-in-the-log-automatically' into 'devel'
Resolve "SOL / EOL events not being inserted in the log automatically"

Closes #127

See merge request wgp/dougal/software!43
2023-09-29 14:17:46 +00:00
D. Berge
ab58e578c9 Use DEBUG library throughout 2023-09-29 16:16:33 +02:00
D. Berge
0e58b8fa5b Refactor code to identify candidate schemas.
As part of the refactoring, we took into account a slight payload
format change (project configuration details are under the `data`
attribute).
2023-09-29 16:13:35 +02:00
D. Berge
99ac082f00 Use common naming convention both online and offline 2023-09-29 16:11:44 +02:00
D. Berge
4d3fddc051 Merge branch '274-use-new-db-event-notifier-for-event-processing-handlers' into 'devel'
Resolve "Use new DB event notifier for event processing handlers"

Closes #275, #230, and #274

See merge request wgp/dougal/software!42
2023-09-29 14:03:00 +00:00
D. Berge
42456439a9 Remove ad-hoc notifier 2023-09-29 15:59:12 +02:00
D. Berge
ee0c0e7308 Replace ad-hoc notifier with pg-listen based version 2023-09-29 15:59:12 +02:00
D. Berge
998c272bf8 Add var/* to .gitignore 2023-09-29 15:59:12 +02:00
D. Berge
daddd1f0e8 Add script to rewrite packet captures IP and MAC addresses.
Closes #230.
2023-09-29 15:58:59 +02:00
D. Berge
17f20535cb Cope with fragmented UDP packets.
Fixes #275.

Use this as the systemd unit file to run as a service:

[Unit]
Description=Dougal Network Packet Capture
After=network.target remote-fs.target nss-lookup.target

[Service]
ExecStart=/srv/dougal/software/sbin/packet-capture.sh
ExecStop=/bin/kill -s QUIT $MAINPID
Restart=always
User=root
Group=users
Environment=PATH=/usr/bin:/usr/sbin:/usr/local/bin
Environment=INS_HOST=172.31.10.254
WorkingDirectory=/srv/dougal/software/var/
SyslogIdentifier=dougal.pcap

[Install]
WantedBy=multi-user.target
2023-09-29 15:28:11 +02:00
D. Berge
0829ea3ea1 Save a copy of the headers, not the original.
Otherwise ExpressJS will complain about trying to modify
headers that have already been sent.
2023-09-24 12:17:16 +02:00
D. Berge
2069d9c3d7 Remove dead code 2023-09-24 12:15:06 +02:00
D. Berge
8a2d526c50 Ignore schema attribute in PATCH payload.
Fixes #273.
2023-09-24 12:14:20 +02:00
D. Berge
8ad96d6f73 Ensure that requiredFields is always defined.
Otherwise, `Object.entries(requiredFields)` may fail.
2023-09-24 11:59:26 +02:00
D. Berge
947faf8c05 Provide default glob specification for map layer imports 2023-09-24 11:34:10 +02:00
D. Berge
a948556455 Fail gracefully if map layer data does not exist.
Fixes #272.
2023-09-24 11:33:32 +02:00
D. Berge
835384b730 Apply path conversion to QC definition files 2023-09-23 22:50:09 +02:00
D. Berge
c5b93794f4 Move path conversion to general utilities 2023-09-23 13:44:53 +02:00
D. Berge
056cd32f0e Merge branch '271-qc-results-not-being-refreshed' into 'devel'
Resolve "QC results not being refreshed"

Closes #271

See merge request wgp/dougal/software!41
2023-09-18 10:08:35 +00:00
D. Berge
49bb413110 Merge branch '270-real-time-interface-stopped-working' into 'devel'
Resolve "Real-time interface stopped working"

Closes #270

See merge request wgp/dougal/software!40
2023-09-18 10:08:27 +00:00
D. Berge
ceccc42050 Don't cache response ETags for QC endpoints 2023-09-18 12:06:38 +02:00
D. Berge
aa3379e1c6 Adapt RTI save function to refactored project configuration in DB 2023-09-18 11:58:55 +02:00
D. Berge
4063af0e25 Merge branch '268-inline-crossline-errors-no-longer-being-calculated' into 'devel'
Resolve "Inline/crossline errors no longer being calculated"

Closes #268

See merge request wgp/dougal/software!39
2023-09-15 18:03:51 +00:00
D. Berge
d53e6060a4 Update database templates to v0.4.2 2023-09-15 20:01:54 +02:00
D. Berge
85d8fc8cc0 Update required database version 2023-09-15 14:22:22 +02:00
D. Berge
0fe40b1839 Add missing require 2023-09-15 14:22:02 +02:00
D. Berge
21de4b757f Add database upgrade file 29. 2023-09-15 12:52:42 +02:00
D. Berge
96cdbb2cff Add database upgrade file 28. 2023-09-15 12:52:27 +02:00
D. Berge
d531643b58 Add database upgrade file 27. 2023-09-15 12:52:06 +02:00
D. Berge
a1779ef488 Do not cache /navdata endpoint responses 2023-09-14 13:20:16 +02:00
D. Berge
5239dece1e Do not cache GIS endpoint responses 2023-09-14 13:19:57 +02:00
D. Berge
a7d7837816 Allow only admins to patch project configurations 2023-09-14 13:19:16 +02:00
D. Berge
ebcfc7df47 Allow everyone to access project configuration.
This is necessary as it is requested by various parts of the
frontend.

Consider more granular access control.
2023-09-14 13:17:28 +02:00
D. Berge
dc4b9002fe Adapt QC endpoints to new configuration APIs 2023-09-14 13:15:59 +02:00
D. Berge
33618b6b82 Do not cache Set-Cookie headers 2023-09-14 13:13:47 +02:00
D. Berge
597d407acc Adapt QC view to new label payload from API 2023-09-14 13:13:18 +02:00
D. Berge
6162a5bdee Stop importing P1/90s until scripts are upgraded.
See #266.
2023-09-14 13:09:38 +02:00
D. Berge
696bbf7a17 Take etc/config.yaml out of revision control.
This file contains site-specific configuration. Instead, an
example config.yaml is now provided.
2023-09-14 13:07:33 +02:00
D. Berge
821fcf0922 Add wx forecast info to plan (experiment).
Use https://open-meteo.com/ as a weather forecast provider.

This code is intended for demonstration only, not for
production purposes.

(issue #157)


(cherry picked from commit cc4bce1356)
2023-09-13 20:04:15 +00:00
D. Berge
b1712d838f Merge branch '245-export-event-log-as-csv' into 'devel'
Resolve "Export event log as CSV"

Closes #245

See merge request wgp/dougal/software!38
2023-09-13 20:02:07 +00:00
D. Berge
895b865505 Expose CSV output option in user interface 2023-09-13 21:59:57 +02:00
D. Berge
5a2af5c49e Add CSV output option for events log 2023-09-13 21:58:06 +02:00
D. Berge
24658f4017 Allow patching project name if no name is already set 2023-09-13 16:13:43 +02:00
D. Berge
6707cda75e Ignore case when patching configuration ID 2023-09-13 16:13:12 +02:00
D. Berge
1302a31b3d Improve formatting of layer alert 2023-09-13 13:00:19 +02:00
D. Berge
871a1e8f3a Don't show alert if layer is empty (but log to console) 2023-09-13 12:59:47 +02:00
D. Berge
04e1144bab Simplify expression 2023-09-13 12:59:24 +02:00
D. Berge
6312d94f3e Add support for user layer tooltips and popups 2023-09-13 12:58:44 +02:00
D. Berge
ed91026319 Add tooltip and popup options to map layer configuration.
- `tooltip` takes the name of a GeoJSON property that will be
  shown in a tooltip when hovering the mouse over a feature.

- `popup` can take either the name of a property as above, or
  the boolean value `true`. In the latter case, a table of all
  the feature's properties will be shown when clicking on the
  feature. In the former case, only the value of the designated
  property will be shown.
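
For illustration, a layer entry using both options (written as a
JavaScript object; keys other than `tooltip` and `popup` are
assumptions):

{
  name: "Obstructions",  // assumed
  tooltip: "name",       // hover shows the feature's `name` property
  popup: true            // click shows a table of all the feature's properties
}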
2023-09-13 12:55:37 +02:00
D. Berge
441a4e296d Import map layers from the runner 2023-09-13 11:24:04 +02:00
D. Berge
c33c3f61df Alert the user if a map layer is too big 2023-09-13 11:22:49 +02:00
D. Berge
2cc293b724 Do not fail trying to restore state for non-existing layers 2023-09-13 11:22:05 +02:00
D. Berge
ee129b2faa Merge branch '114-allow-users-to-show-arbitrary-geojson-on-the-map' into 'devel'
Resolve "Allow users to show arbitrary GeoJSON on the map."

Closes #114

See merge request wgp/dougal/software!37
2023-09-12 17:34:51 +00:00
D. Berge
98d9b3b093 Adapt Map view to new label payload from API 2023-09-12 19:31:58 +02:00
D. Berge
57b9b420f8 Show an error if a layer is too large.
The map view limits the size of layers (both user and regular) in
order to keep the system responsive, as Leaflet is not great at
handling large layers.
2023-09-12 19:29:02 +02:00
D. Berge
9e73f2603a Implement user layers on map view.
The user layers are defined in the project configuration under
`imports.map.layers`.

Multiple layers may be defined and each layer may consist of one
or more GeoJSON files. Files are retrieved via the /files/ API
endpoint.
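
A hypothetical configuration sketch (only the `imports.map.layers` key
is confirmed here; the entry fields are assumptions):

const config = {
  imports: {
    map: {
      layers: [
        { name: "Obstructions", files: "gis/obstructions/*.geojson" }
      ]
    }
  }
};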
2023-09-12 19:29:02 +02:00
D. Berge
707889be42 Refactor layer API endpoint and database functions.
- A single get() function is used either to list all available
  layers, if no layer name is given, or to fetch a single layer.
- The database no longer holds the actual layer contents,
  only the path to the layer file(s), so the list() function
  is now redundant as we return the full payload in every case.
- The /gis/layer and /gis/layer/:name endpoints now have the same
  payload structure.
2023-09-12 19:29:02 +02:00
D. Berge
f9a70e0145 Refactor map layer importer.
- Now a layer may consist of a path pointing to a directory plus a
  glob, or a path pointing directly to a single file.
- If a file already exists in the database, check if the layer
  name has changed and if so, update it.
- Do not import the actual file contents, as the path is enough
  (it can be retrieved via the /file/:path API endpoint).
2023-09-12 11:05:10 +02:00
D. Berge
b71489cee1 Add get_file_data() function to datastore 2023-09-12 11:04:37 +02:00
D. Berge
0a9bde5f10 Add Background layer to map.
This is a limited implementation of layer backgrounds. The API
supports an arbitrary number of arbitrarily named background
layers, but for the time being we only recognise one background
layer named `Background` and of GeoJSON type.

Certain properties, such as colour/color, opacity, etc., are
recognised and applied as feature styles. If none are present, a
default style is used.
2023-09-11 10:17:10 +02:00
D. Berge
36d5862375 Add map layer middleware and API endpoints 2023-09-11 10:15:19 +02:00
D. Berge
398c702004 Add map layer functions to database interface 2023-09-11 10:12:46 +02:00
D. Berge
b2d1798338 Add map layer importer 2023-09-11 10:00:59 +02:00
D. Berge
4f165b0c83 Revert behaviour of new jwt-express version.
Fixes breakage introduced in commit
cd00f8b995.
2023-09-10 14:09:01 +02:00
D. Berge
2c86944a51 Merge branch '262-preset-remarks-and-labels-no-longer-working-with-api-0-4-0' into 'devel'
Resolve "Preset remarks and labels no longer working with API 0.4.0"

Closes #262

See merge request wgp/dougal/software!36
2023-09-10 10:10:22 +00:00
D. Berge
5fc51de7d8 Adapt Log view to new configuration endpoint in the API 2023-09-10 12:01:59 +02:00
D. Berge
158e0fb788 Adapt Log view to new label payload from API 2023-09-10 12:01:30 +02:00
D. Berge
941d15c1bc Return labels directly from project configuration.
NOTE: This is a breaking API change. Before this it returned an
array of labels, now it returns an object.
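
Illustratively (entry contents are project-specific):

Previous:

[
  { name: "QC", … },
  { name: "Daily", … }
]

Current:

{
  QC: { … },
  Daily: { … }
}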
2023-09-10 11:59:38 +02:00
D. Berge
cd00f8b995 Breaking-change Node package updates (server) 2023-09-10 11:49:56 +02:00
D. Berge
44515f8e78 Non-breaking Node package updates (server) 2023-09-09 20:54:04 +02:00
D. Berge
54fbc76da5 Merge branch '261-wrong-missing-shots-value-in-sequence-summary' into 'devel'
Resolve "Wrong missing shots value in sequence summary"

Closes #261

See merge request wgp/dougal/software!35
2023-09-09 18:46:33 +00:00
D. Berge
c1b5196134 Update database templates to v0.3.12.
Incorporates fix for bug #261.
2023-09-09 20:45:11 +02:00
D. Berge
fb3d3be546 Trailing slash in API call results in "unauthorised" error.
No idea why.
2023-09-09 20:39:49 +02:00
D. Berge
8e11e242ed Remove NODE_OPTIONS from scripts.
Node version 18 does not seem to like it.
2023-09-09 20:37:08 +02:00
D. Berge
8a815ce3ef Add database upgrade file 26. 2023-09-09 20:23:20 +02:00
D. Berge
91076a50ad Show API error messages if available 2023-09-09 17:00:32 +02:00
D. Berge
e624dcdde0 Support async API callbacks in Vuex action 2023-09-09 16:59:43 +02:00
D. Berge
a25676122c Update material design icons dependency 2023-09-09 16:58:44 +02:00
D. Berge
e4dfbe2c9a Update minimum node version to 18 2023-09-09 16:57:20 +02:00
D. Berge
78fb34d049 Update the API version number 2023-09-09 16:56:52 +02:00
D. Berge
38c4125f4f Support patching values out of the configuration.
A configuration patch having keys with null values will result
in those keys being removed from the configuration.
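
For example, PATCHing the configuration with

{ "labels": { "Obsolete": null } }

would remove the `Obsolete` key from `labels` (key names illustrative).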
2023-09-09 16:53:42 +02:00
D. Berge
04d6cbafe3 Use refactored database API in QC executable 2023-09-09 16:42:30 +02:00
D. Berge
e6319172d8 Fix typo in QC executable 2023-09-09 16:42:00 +02:00
D. Berge
5230ff63e3 Use new database API calls for configuration 2023-09-09 16:39:53 +02:00
D. Berge
2b364bbff7 Make bin script compatible with Python 3.6 2023-09-09 16:38:51 +02:00
D. Berge
c4b330b2bb Don't cache ETags for /files/ endpoint.
As we have no practical way of invalidating those.
2023-09-02 16:06:31 +02:00
D. Berge
308eda6342 Use ETag middleware 2023-09-02 15:29:39 +02:00
D. Berge
e8b1cb27f1 Add ETag middleware 2023-09-02 15:29:24 +02:00
D. Berge
ed14fd0ced Add notifier to DB library 2023-09-02 15:28:17 +02:00
D. Berge
fb10e56487 Add pg-listen dependency 2023-09-02 15:26:53 +02:00
D. Berge
56ed0cbc79 Merge branch '246-add-endpoint-for-creating-a-new-survey' into 'devel'
Resolve "Add endpoint for creating a new survey"

Closes #179, #174, and #246

See merge request wgp/dougal/software!29
2023-09-02 13:10:56 +00:00
D. Berge
227e588782 Merge branch '248-dougal-event-log-takes-a-long-time-to-register-new-events' into 'devel'
Resolve "Dougal event log takes a long time to register new events"

Closes #248

See merge request wgp/dougal/software!30
2023-09-02 13:09:56 +00:00
D. Berge
53f2108e37 Adapt import functions to use logical paths 2023-08-30 14:56:09 +02:00
D. Berge
ccf4bbf547 Use logical paths rather than physical 2023-08-30 14:54:27 +02:00
D. Berge
c99a625b60 Add function to retrieve survey configurations from DB.
As the survey definitions will no longer be stored in files
under etc/surveys/ but directly in the database, this
function replaces configuration.surveys().
2023-08-30 14:27:15 +02:00
D. Berge
25ab623328 Add functions for translating paths.
The Dougal database will no longer store physical file paths
but rather logical ones, relative to (config.yaml).imports.paths.

These functions translate between physical and logical paths.
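
Sketch of the idea (function name and signature are illustrative):

const path = require("path");

// physical: /mnt/nav/p190/line-1001.p190
// logical:  p190/line-1001.p190  (relative to an imports.paths entry)
function toLogical (physical, roots) {
  const root = roots.find(r => physical.startsWith(r));
  return root ? path.relative(root, physical) : null;
}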
2023-08-30 14:17:47 +02:00
D. Berge
455888bdac Fix method signature 2023-08-30 14:16:08 +02:00
D. Berge
b650ece0ee Add imports.paths key to config.yaml.
Used to tell Dougal which parts of the filesystem may be
accessed by users via the API (more specifically, via the
`/files/` API endpoints).
2023-08-30 14:12:07 +02:00
D. Berge
2cb96c0252 Let user download P1s from the Sequences tab 2023-08-30 14:08:28 +02:00
D. Berge
70cf59bb4c Add API files endpoint.
Used to download files. It relies on `imports.paths` being set
appropriately in `etc/config.yaml` to indicate which parts of
the filesystem are accessible to users via Dougal.
2023-08-30 13:51:31 +02:00
D. Berge
ec03627119 Remove logging statements 2023-08-30 13:48:26 +02:00
D. Berge
675c19f060 Fix whitespace 2023-08-30 13:47:51 +02:00
D. Berge
6721b1b96b Add API endpoint for patching a project 2023-08-30 13:47:02 +02:00
D. Berge
b4f23822c4 Fix db.configuration.get() 2023-08-30 13:43:36 +02:00
D. Berge
3dd1aaeddb Fix indentation 2023-08-30 13:42:25 +02:00
D. Berge
1e593e6d75 Clean up if project creation fails 2023-08-30 13:41:28 +02:00
D. Berge
ddbcb90c1f Add deepMerge() utility function 2023-08-30 13:37:01 +02:00
D. Berge
229fdf20ef Reload the project list on insert or deletion 2023-08-23 19:35:12 +02:00
D. Berge
72e67d0e5d React to project deletion 2023-08-23 19:34:47 +02:00
D. Berge
b26fefbc37 Show user-friendly message if a project cannot be found 2023-08-23 19:33:50 +02:00
D. Berge
04e0482f60 Vuex: add getters for project info 2023-08-23 19:31:22 +02:00
D. Berge
62f90846a8 Vuex: clear project variables if project not found 2023-08-23 19:30:52 +02:00
D. Berge
1f9c0e56fe Default npm run serve to 0.0.0.0 2023-08-23 19:29:13 +02:00
D. Berge
fe9d3563a0 Add API endpoint to delete a project 2023-08-23 19:26:27 +02:00
D. Berge
38a07dffc6 Add API endpoint to retrieve project configuration.
Only available to users with at least `write` access.
2023-08-23 19:26:27 +02:00
D. Berge
1a6500308f Add API endpoint for creating a project 2023-08-23 19:26:27 +02:00
D. Berge
6033b45ed3 Refactor API middleware.
The middleware naming is kept consistent with the HTTP verb that
they handle.
2023-08-23 19:17:20 +02:00
D. Berge
33edef6647 Use modified body-parser accepting YAML 2023-08-23 19:12:44 +02:00
D. Berge
8f8e8b7492 Implement db.project.delete().
Removes a project from the database, but only if the project is
empty, i.e., it has no preplots, no lines and no events in its
log (except deleted).
2023-08-21 14:50:20 +02:00
D. Berge
ab5e3198aa Add DB function to return project configuration.
NOTE: mostly redundant with db.configuration.get(),
see previous commit.
2023-08-21 14:49:22 +02:00
D. Berge
60ed850d2d Change db.configuration.get() to use database.
NOTE: this endpoint is redundant with db.project.configuration.get()
except for its ability to return a partial tree.

TODO: merge this with db.project.configuration.get().
2023-08-21 14:46:51 +02:00
D. Berge
63b9cc5b16 Add database functions for project creation.
Instead of storing the project configuration in a YAML file
under `etc/surveys/`, this is now stored in public.projects.meta.

NOTE: as of this commit, the runner scripts (`bin/*.py`) are not
aware of this change and they will keep looking for project info
under `etc/surveys`. This means that projects created directly
in the database will be invisible to Dougal until the runner
scripts are changed accordingly.
2023-08-21 14:39:45 +02:00
D. Berge
f2edd2bec5 Refactor project DB functions.
The old db.project.list() function is now db.project.get()
and the old db.project.get() is now db.project.summary.get().

If a project does not exist, db.project.summary.get() now
throws a 404 rather than a database error.
2023-08-21 14:36:02 +02:00
D. Berge
44ad59130f Add pid2schema function.
Translates a project ID into a database schema name.
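
Illustrative only; the actual mapping scheme is not shown here:

function pid2schema (pid) {
  // Prefix and sanitisation are assumptions, e.g. "NS2301" → "project_ns2301"
  return "project_" + String(pid).toLowerCase().replace(/[^a-z0-9]+/g, "_");
}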
2023-08-21 14:31:23 +02:00
D. Berge
ecbb1e04ee Do not disable event edit form while loading data 2023-08-20 20:07:29 +02:00
D. Berge
7cb2c3ef49 Add comment 2023-05-30 17:20:35 +02:00
D. Berge
ff4f6bfd78 Ensure that we're connected to the Dougal database 2023-05-30 17:19:23 +02:00
D. Berge
fbe0cb5efa Default the API prefix to /api 2023-05-18 18:34:10 +02:00
D. Berge
aa7cbed611 Do not require authentication to query API version 2023-05-18 18:32:26 +02:00
D. Berge
89061f6411 Print port and prefix on startup 2023-05-18 18:30:48 +02:00
D. Berge
838883d8a3 Update caniuse version (package-lock) 2023-05-18 18:29:44 +02:00
D. Berge
cd196f1acd Add option needed for node v16+ support.
Note: this may cause the client *not* to start on node versions
less than 16.
2023-05-18 18:28:36 +02:00
D. Berge
a2b894fceb Fix class instantiation error.
Closes #252.
2023-05-12 15:32:12 +02:00
D. Berge
c3b3a4c70f Remove lock file if inhibiting tasks 2023-04-11 20:50:59 +02:00
D. Berge
8118641231 Do not run tasks if required mounts are not present.
A configuration item `imports.mounts` is added to
`etc/config.yaml`. This should be a list of paths
which must be non-empty. If any of the paths in that
list is empty, runner.sh will abort.

Closes #200.
2023-04-10 15:04:12 +02:00
D. Berge
6d8a199a3c Allow setting IP to listen on.
Running on bare metal, 127.0.0.1 is a sensible choice of address
to bind on, but that is not the case when running inside a
container, so we add the ability to choose which IP to listen on.

This can be given via the environment variable HTTP_HOST when
starting the server or, if used as a module, as the second
argument of the start(port, host, path) function.
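
For example (the entry point name is illustrative):

HTTP_HOST=0.0.0.0 node server.js

or, when used as a module:

require("./server").start(8080, process.env.HTTP_HOST || "127.0.0.1");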
2023-04-07 09:04:51 +02:00
D. Berge
5a44e20a5b Do not error out of npm install if postinstall fails.
The postinstall script will (rightly) return non-zero if the API
docs cannot be built, but this creates a problem when building a
container (Docker) image. In that case, we expect the postinstall
to fail, as the required files (spec/*) will not have been copied
into the image when `npm install` is run.

By adding an explicit OR clause we allow postinstall to end
gracefully whether or not the API docs have been built.
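
In package.json terms, something like (the script body is illustrative):

"scripts": {
  "postinstall": "npm run apidocs || true"
}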
2023-04-05 12:32:35 +02:00
D. Berge
374739c133 Request ancillary library via HTTPS rather than SSH.
Otherwise newer versions of npm will choke during `npm install` due
to this npm bug: https://github.com/npm/cli/issues/2610
2023-04-04 18:16:50 +02:00
D. Berge
992205da4a Add event handler for midnight shot detection.
This event handler checks if there is a UTC date jump between
consecutive shots. If a jump is detected, it sends two new entries
to the event log, for the last shot and first shot of the previous
and current dates, respectively.

Fixes #223.
2022-05-15 14:06:18 +02:00
D. Berge
f5e08c68af Replace console output by debug functions 2022-05-15 13:38:47 +02:00
D. Berge
105fee0623 Update database schema template.
* midnight_shots uses final_shots rather than raw_shots
* log_midnight_shots removes stale midnight events
2022-05-15 13:28:15 +02:00
D. Berge
aff974c03f Modify log_midnight_shots() to remove non-relevant midnight shots.
Those shots could for instance have been removed due to a line edit.
2022-05-15 13:20:01 +02:00
D. Berge
bada6dc2e2 Modify DB upgrade file 25 to use final_shots 2022-05-15 13:19:01 +02:00
D. Berge
d5aac5e84d Add network packet capture script.
The idea is to capture incoming real-time data to be able to
replay it later on development systems, e.g., for new development
or troubleshooting.

Issue #230.
2022-05-14 11:57:09 +02:00
D. Berge
3577a2ba4a Change sass version specification in package-lock.
Should stop `npm install` from modifying it.
2022-05-13 19:19:45 +02:00
D. Berge
04df9f41cc Add script for daily database housekeeping.
The script bin/daily_tasks.py is intended to be run shortly after
midnight every day (e.g., via cron).

At the moment it inserts any missing LDSP / FDSP events. It can
be extended with other tasks as needed either by expanding
Datastore.run_daily_tasks() or by adding to bin/daily_tasks.py.

Fixes #223.
2022-05-13 19:04:39 +02:00
D. Berge
fdb5e0cbab Update database templates to v0.3.12.
* Add midnight_shots view
* Add log_midnight_shots() procedure
2022-05-13 18:55:43 +02:00
D. Berge
4b832babfd Add database upgrade file 25.
This defines a midnight_shots view and a log_midnight_shots() procedure
(with some overloads). The view returns all points straddling midnight
UTC and belonging to the same sequence (so last shot of the day and
first shot of the next day).

The procedure inserts the corresponding events (optionally constrained
by an earliest and a latest date) in the event log, unless the events
already exist.

Related to #223.
2022-05-13 18:53:32 +02:00
D. Berge
cc3a9b4e5c Fix comment 2022-05-13 18:52:40 +02:00
D. Berge
da5a708760 Add controls to hide accepted / all QC events.
Closes #218, #219.
2022-05-13 18:17:02 +02:00
D. Berge
9834e85eb9 Add placeholders hint, for discoverability 2022-05-12 23:38:39 +02:00
D. Berge
e19601218a Cope with schema not being detected 2022-05-12 23:04:07 +02:00
D. Berge
15c56d3f64 Use new debug() functions 2022-05-12 23:03:31 +02:00
D. Berge
632dd1ee75 Add placeholder replacement to log housekeeping tasks 2022-05-12 22:57:23 +02:00
D. Berge
aeff5a491d Update required database schema 2022-05-12 22:55:08 +02:00
D. Berge
9179c9332d Revert "Show sequence comments in log page"
This reverts commit a5db9c984b.

Fixes #210.
2022-05-12 22:46:11 +02:00
D. Berge
bb5de9a00e Update insert_event.py.
This script now works with the new event log.

Fixes #234. Midnight positions can be added via a cronjob such
as:

$DOUGAL_ROOT/bin/insert_event.py -t "$(date -I) 00:00:00Z" \
    -l Daily -l Prod \
    "Midnight position: @DMS@ (@POS@)"
2022-05-12 22:21:38 +02:00
D. Berge
d6b985fcd2 Replace event remarks placeholders in API data.
Events being created or edited via the API now call
replace_placeholders(), making it possible to use
shortcuts to enter some event-related information.

See #229 for details.
2022-05-12 22:10:33 +02:00
D. Berge
3ed8339aa3 Migrate more console messages to debug() 2022-05-12 22:09:08 +02:00
D. Berge
1b925502bc Update database templates to v0.3.11.
* Redefine augment_event_data()
2022-05-12 21:59:38 +02:00
D. Berge
7cea79a9be Add database upgrade file 24.
This redefines augment_event_data() to use interpolation rather than
nearest neighbour. It now takes an argument indicating the maximum
allowed interpolation timespan. An overload with a default of ten
minutes is also provided, as an in situ replacement for the previous
version.

The ten minute default is based on Triggerfish headers behaviour seen
on crew 248 during soft starts.
2022-05-12 21:58:51 +02:00
D. Berge
69f565f357 Update database templates to v0.3.10.
* Add interpolate_geometry_from_tstamp()
2022-05-12 21:52:31 +02:00
D. Berge
23de4d00d7 Add database upgrade file 23.
This defines an interpolate_geometry_from_tstamp() function, taking a timestamp
and a maximum timespan in seconds. It will then interpolate a position
at the exact timestamp based on data from real_time_inputs, provided
that the effective interpolation timespan does not exceed the maximum
requested.

Fixes #243.
2022-05-12 21:51:00 +02:00
D. Berge
1992efe914 Update database templates to v0.3.9.
* Add replace_placeholders()
* Add scan_placeholders() procedure
2022-05-12 21:47:38 +02:00
D. Berge
c7f3f565cd Add database upgrade file 22.
This defines a replace_placeholders() function, taking as arguments
a text string and either a timestamp or a sequence / point pair. It
uses the latter arguments to find metadata from which it can extract
relevant information and replace it into the text string wherever the
appropriate placeholders appear. For instance, given a call such as
replace_placeholders('The position is @POS@', NULL, 11, 2600) it will
replace '@POS@' with the position of point 2600 in sequence 11, if it
exists (or leave the placeholder untouched otherwise).

A scan_placeholders() procedure is also defined, which calls the above
function on the entire event log.

Fixes #229.
2022-05-12 21:45:56 +02:00
D. Berge
1da02738b0 Update database templates to v0.3.8.
* Add event_position()
* Add event_meta()
2022-05-12 21:40:23 +02:00
D. Berge
732d8e9be6 Add database upgrade file 21.
This adds event_position() and event_meta() functions which are used
to retrieve position or metadata, respectively, given either a timestamp
or a sequence / point pair. Intended to be used in the context of #229.
2022-05-12 21:38:28 +02:00
D. Berge
a2bd614b17 Update database templates.
* Optimise public.geometry_from_tstamp()
* Remove index on public.real_time_inputs.meta->>'tstamp'
* Fix adjust_planner()
2022-05-10 21:57:53 +02:00
D. Berge
003c833293 Add database upgrade file 20.
This updates the adjust_planner() procedure to take into account the
new events schema (the `event` view has been replaced by `event_log`).

Fixes #208.
2022-05-10 21:54:46 +02:00
D. Berge
a4c458dc16 Add database upgrade file 19.
Rewrites geometry_from_tstamp() to make it more efficient.

Fixes #241.
2022-05-10 21:52:24 +02:00
D. Berge
f7b6ca3f79 Log runner output to syslog (if so configured).
The variable DOUGAL_LOG_FACILITY must be defined in the environment
(e.g., in ~/.dougalrc) for syslog to be enabled.
2022-05-08 15:30:05 +02:00
D. Berge
a7cce69c81 Add logging statements 2022-05-08 15:26:15 +02:00
D. Berge
2b20a5d69f Update line details on reimport conflict.
To deal with misnamed lines.

Fixes #240.
2022-05-08 15:25:11 +02:00
D. Berge
4fc5d1deda Add links to first / last page.
Fixes #237.
2022-05-07 14:58:16 +02:00
D. Berge
df13343063 Colour map QC events according to their labels.
We take the first label associated with the event (if any) and use
the label's colour for the event marker. We override the colour for
QC events and use a default value for events with no labels or if
the label does not have an associated colour.
2022-05-07 12:07:03 +02:00
D. Berge
a5603cf243 Fix detection of map QC events.
Fixes #236.
2022-05-07 12:05:56 +02:00
D. Berge
b6d4236325 Make prime data stand out.
Fixes #228.
2022-05-06 18:07:09 +02:00
D. Berge
7e8f00d9f2 Explicitly label comment sections in default template 2022-05-06 17:15:09 +02:00
D. Berge
721cfb36d1 Use timestamp from message payload if it has one.
Fixes #221.
2022-05-06 15:17:10 +02:00
D. Berge
222c951e49 Add debugging to navdata/save.
To help track down #221.
2022-05-06 14:31:06 +02:00
D. Berge
45d2e56ed1 Add debug() module.
It uses https://github.com/debug-js/debug but it is meant to be
called like this:

const debug = require("DOUGAL_ROOT/debug")(__filename);

That way the calling module's path is used as the debug namespace.
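
A plausible implementation sketch (the real module may differ):

// DOUGAL_ROOT/debug/index.js
const path = require("path");
const debug = require("debug");
module.exports = (filename) =>
  debug(path.relative(process.env.DOUGAL_ROOT || "", filename));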
2022-05-06 14:11:31 +02:00
D. Berge
c5b6c87278 Add DOUGAL_ROOT symlink to node_modules.
This can be used as a shortcut when requiring a module from deep
within the file hierarchy, e.g., instead of:

require("../../../../lib/db");

one can do:

require("DOUGAL_ROOT/lib/db");
2022-05-06 14:08:19 +02:00
D. Berge
fd37e8b8d6 Add context option to accept/unaccept QCs.
Closes #220.
2022-05-04 19:45:20 +02:00
D. Berge
ce0310d0b0 Silence error on non-existent label definition 2022-05-04 19:42:53 +02:00
D. Berge
546bc45861 Remove dead code 2022-05-04 18:35:20 +02:00
D. Berge
602f2c0a34 Merge branch '215-flag-unflag-qc-results-as-accepted' into 'devel'
Resolve "Flag / unflag QC results as accepted"

Closes #215

See merge request wgp/dougal/software!28
2022-05-04 16:32:48 +00:00
D. Berge
37de5ab223 Implement UI for flagging QCs as accepted or unaccepted 2022-05-04 18:21:42 +02:00
D. Berge
d69c6c4150 Add DougalQcAcceptance Vue.js component.
Widget for use in the QC view to show controls for accepting or
unaccepting QCs.
2022-05-04 18:20:28 +02:00
D. Berge
d80f44547b Update API description 2022-05-04 18:13:14 +02:00
D. Berge
6c8515a879 Add QC results accept/unaccept API endpoints 2022-05-04 18:11:05 +02:00
D. Berge
bb9340a0af Add QC results accept/unaccept middleware.
This middleware can only deal with shot QCs, not sequence-wide QCs.
2022-05-04 17:22:18 +02:00
D. Berge
672c14fb67 Add functions to accept/unaccept QCs.
These are only able to deal with shot QCs. At this point, sequence-wide
QCs cannot be marked as accepted.
2022-05-04 17:19:20 +02:00
D. Berge
f4ee798bf0 Implement endpoint for QC deletion.
Closes #217.
2022-05-04 17:15:28 +02:00
D. Berge
c8ef089b28 Log speed value on Hydronav error.
Related to #206.
2022-05-03 23:58:42 +02:00
D. Berge
1f6d560d7e Style log events according to online/offline status.
Strictly speaking, it doesn't consider (or know) what the shooting
status is (but see #214). All it does is colour events differently
if they have all three of: sequence, point and timestamp.

This is probably good enough for the time being to close #134.
2022-05-03 23:42:58 +02:00
D. Berge
f37e07796c Change description of QC test.
It's not an error but only a warning.
2022-05-03 17:27:34 +02:00
D. Berge
349c052db0 Use all sequences to build QC tree.
Fixes #213.
2022-05-03 17:23:50 +02:00
D. Berge
1c291db6c6 Add database upgrade file 18.
* Adds label_in_sequence() function

NOTE: This function is already defined in schema-template.sql but
seemingly never got pushed into production.

Fixes #211.
2022-05-02 13:40:33 +02:00
D. Berge
f46fd4b6bc Cope with non-existing configuration paths.
Fixes #212.
2022-05-02 13:15:41 +02:00
D. Berge
10883eb1a6 Check for invalid speed values in Hydronav header.
Related to #206. If this is indeed what is causing the alerts,
we will change the logic so that it simply logs (or ignores)
invalid speeds rather than throwing.
2022-05-02 13:09:43 +02:00
D. Berge
af6e419aab Run QCs from runner.
When importing an old project, the first QC run could take a while
and cause a bit of backlog, but during normal shooting it is expected
that it will finish quite quickly (and this is monitored anyway).
2022-05-01 21:26:10 +02:00
D. Berge
6516896bae Disable system imports in runner.
They're not really used. Will probably remove at a later date.
2022-05-01 21:24:56 +02:00
D. Berge
c495dce27d Don't show event history widget for guests.
NOTE: guests still do have access to the relevant API endpoint.
In theory, a persistent and computer literate guest user could
visit the API endpoint directly and retrieve the edit history.
As the edit history may need to be given to users who otherwise
do not have write access, it is considered quite acceptable to
allow guest users to access the endpoint.

Closes #194.
2022-05-01 21:20:52 +02:00
D. Berge
40d96230d2 Adjust planner times from runner.
Fixes #167.
2022-05-01 20:27:19 +02:00
D. Berge
d607b4618a Merge branch '182-periodically-scan-the-events-table-for-missing-information' into 'devel'
Resolve "Periodically scan the events table for missing information"

Closes #182

See merge request wgp/dougal/software!26
2022-05-01 18:24:35 +00:00
D. Berge
fd41d2a6fa Launch database housekeeping tasks from runner 2022-05-01 20:10:27 +02:00
D. Berge
39690c991b Update database templates.
* Add index on public.real_time_inputs.meta->>'tstamp'
* Add public.geometry_from_tstamp()
* Add augment_event_data()
2022-05-01 19:47:16 +02:00
D. Berge
09ead4878f Add database upgrade file 17 2022-05-01 19:46:04 +02:00
D. Berge
588d210f24 Fix reporting for “gun pressures” QC test.
Fixes #205.
2022-04-30 17:37:38 +02:00
D. Berge
28be86e7ff Graphs view: delay “no sequences” message until loaded.
Related to #196.
2022-04-30 16:14:32 +02:00
D. Berge
1eac97cbd0 Change “No fire” QC definition 2022-04-30 16:13:12 +02:00
D. Berge
e3a3bdb153 Clean up whitespace.
Commands used:

find . -type f -name '*.js'| while read FILE; do if echo $FILE |grep -qv node_modules; then sed -ri 's/^\s+$//' "$FILE"; fi; done
find . -type f -name '*.vue'| while read FILE; do if echo $FILE |grep -qv node_modules; then sed -ri 's/^\s+$//' "$FILE"; fi; done
find . -type f -name '*.py'| while read FILE; do if echo $FILE |grep -qv node_modules; then sed -ri 's/^\s+$//' "$FILE"; fi; done
2022-04-29 14:48:21 +02:00
D. Berge
0e534b583c Do not assume that lines have remarks.
Fixes #202.
2022-04-29 14:32:46 +02:00
D. Berge
51480e52ef Recognise "dark", "light" label view attributes.
In a label definition (in etc/surveys/*.yaml) we can now have
"dark" or "light" attributes under "view" to force the label
text to always use either the dark or light theme. This is
useful when a label's colour causes a bad contrast in either
theme.

Example:

  labels:
      Daily:
          view:
              colour: "#EFEBE9"
              description: "Of interest in the daily report"
              light: true # Text always displayed in a dark colour
          model:
              user: true
              multiple: true
2022-04-29 12:18:09 +02:00
D. Berge
187807cfb1 Enable Save button as soon as the remarks are changed.
Closes #199.
2022-04-27 19:45:26 +02:00
D. Berge
d386b97e42 Database upgrade 16: fix event edits.
Fixes #198.
2022-04-27 17:41:53 +02:00
D. Berge
da578d2e50 Fix project_summary view returning unwanted rows.
Fixes #197.
2022-04-27 10:49:46 +02:00
D. Berge
7cf89d48dd Fix whitespace 2022-04-26 17:41:48 +02:00
D. Berge
c0ec8298fa Don't try to show QC graphs on a new project.
If there are no sequences, just show a message to that effect.

Fixes #196.
2022-04-26 17:39:59 +02:00
D. Berge
68322ef562 Fix misleading comment.
Use an EPSG code that is actually in the work area of the Dougal boats.
2022-04-26 17:36:48 +02:00
D. Berge
888228c9a2 Do not crash if a project doesn't have QCs defined.
Fixes #195.
2022-04-26 14:50:34 +02:00
D. Berge
74d6f0b9a0 Accept mime query parameter 2022-04-16 17:18:04 +02:00
D. Berge
cf475ce2df Adapt middleware to new database schema.
As introduced by commit 0c6567d8f8.
2022-04-16 17:18:04 +02:00
D. Berge
26033b2a37 Fix syntax error.
Introduced by commit ead938b40f.
2022-04-13 09:04:52 +02:00
D. Berge
fafd4928d9 Fix Marked call (adapt to new Marked version) 2022-04-13 08:18:21 +02:00
D. Berge
ec38fdb290 Pin package sass version to avoid annoying warning 2022-03-18 20:07:50 +01:00
D. Berge
086172c5e7 Upgrade dependencies.
This is a conservative upgrade.

The upgraded version of leaflet-arrowheads uses optional chaining which
seems to cause webpack to choke, so added to "transpileDependencies" in
vue.config.js.

Closes #189.
2022-03-18 16:29:50 +01:00
D. Berge
3db453a271 Add keys to v-for loops 2022-03-18 16:15:06 +01:00
D. Berge
a5db9c984b Show sequence comments in log page 2022-03-18 15:05:08 +01:00
D. Berge
ead938b40f Inhibit exports.
They don't seem to be used, and for backups it's better to
just back up the whole database instead, which is being done
remotely.
2022-03-18 13:32:43 +01:00
D. Berge
634a7be3f1 Merge branch '184-refactor-qcs' into devel 2022-03-17 20:12:15 +01:00
D. Berge
913606e7f1 Allow forcing QCs.
QCs may be re-run for specific sequences or for a whole
project by defining an environment variable, as follows:

For an entire project:

* DOUGAL_FORCE_QC="project-id"

For specific sequences:

* DOUGAL_FORCE_QC="project-id sequence1 sequence2 … sequenceN"
2022-03-17 20:10:26 +01:00
D. Berge
49b7747ded Remove *all* QC events when saving sequence results.
When saving shot-by-shot results for a sequence,
*all* existing QC events for that sequence will be
removed first.

We do this because otherwise we may end up with QC
data for shots that no longer exist. Also, in the
case that we have QCed based on raw data, QC results
for shots which are not in the final data would stay
around even though those shots are no longer valid.
2022-03-17 20:07:11 +01:00
D. Berge
1fd265cc74 Update dependencies 2022-03-17 20:05:07 +01:00
D. Berge
13389706a9 Merge branch '184-refactor-qcs' into devel 2022-03-17 18:43:38 +01:00
D. Berge
818cd8b070 Add pg-cursor dependency, needed by QCs 2022-03-17 18:43:12 +01:00
D. Berge
a3d3c7aea7 Merge branch '184-refactor-qcs' into devel 2022-03-17 18:37:14 +01:00
D. Berge
a592ab5f6c Use digests rather than timestamps for QC execution.
Using timestamps does not work as we might be
importing files with timestamps older than the
last QC run. Those would not be detected by a
timestamp based method but would be by this
digest based approach.

There is a project-wide digest and per sequence
digests. The former takes the path and hashes of
all files known to Dougal for this project (the
`files` table), concatenates them and computes
the MD5 checksum. Sequence digests do the same
but only including the files related to that
sequence.
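
Sketch of the digest computation described above:

const crypto = require("crypto");

function digest (files) {  // files: [{ path, hash }, …]
  const md5 = crypto.createHash("md5");
  for (const f of files) md5.update(f.path + f.hash);  // concatenate path+hash
  return md5.digest("hex");
}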
2022-03-17 18:32:09 +01:00
D. Berge
9b571ce34d Merge branch '138-keep-edit-history-of-event-log-entries' into devel 2022-03-16 21:31:38 +01:00
D. Berge
aa2b158088 Remove spurious actions from DB template 2022-03-16 21:30:32 +01:00
D. Berge
0d1f2b207c Apply changes from 38e4e705a4 to DB schema template 2022-03-16 21:29:53 +01:00
D. Berge
38e4e705a4 Modify database upgrade file 12.
Two functions that were dependent on the `events` view were
changed to work with `event_log` instead.
2022-03-16 21:08:42 +01:00
D. Berge
82d7036860 Merge branch '138-keep-edit-history-of-event-log-entries' into 'devel'
Resolve "Keep edit history of event log entries"

Closes #78, #101, #138, #141, #170, #172, and #181

See merge request wgp/dougal/software!20
2022-03-15 13:25:43 +00:00
D. Berge
0727e7db69 Update database templates to schema v0.3.1 2022-03-15 14:17:28 +01:00
D. Berge
2484b1c473 Merge branch '188-adapt-qc-results-view-to-new-api-endpoints' into 138-keep-edit-history-of-event-log-entries 2022-03-09 21:37:27 +01:00
D. Berge
750beb5c02 Add explicit indication of all tests passed 2022-03-09 21:36:49 +01:00
D. Berge
cd2e7bbd0f Merge branch '184-refactor-qcs' into 138-keep-edit-history-of-event-log-entries 2022-03-09 21:26:40 +01:00
D. Berge
21d5383882 Update QC check definitions 2022-03-09 21:25:47 +01:00
D. Berge
2ec484da41 Fix detection of sequence modification time 2022-03-09 21:25:04 +01:00
D. Berge
648ce9970f Interpolate timestamps for non-existing shotpoints 2022-03-09 21:22:33 +01:00
D. Berge
fd278a5ee6 Add database function: tstamp_interpolate 2022-03-09 21:21:48 +01:00
D. Berge
4f5cce33fc Add comments to database functions 2022-03-09 21:21:01 +01:00
D. Berge
53bb75a2c1 Add new database upgrade file 11.
Some of the things in new upgrade file 12 depend
on the functions defined here.
2022-03-09 19:07:58 +01:00
D. Berge
45595bd64f Rename database upgrades 11‒13 → 12‒14 2022-03-09 19:07:58 +01:00
D. Berge
af4d141c6a Merge branch '184-refactor-qcs' into '138-keep-edit-history-of-event-log-entries'
Resolve "Refactor QCs"

See merge request wgp/dougal/software!22
2022-03-09 17:46:20 +00:00
D. Berge
bef2be10d2 Merge branch '188-adapt-qc-results-view-to-new-api-endpoints' into '184-refactor-qcs'
Resolve "Adapt QC results view to new API endpoints"

See merge request wgp/dougal/software!24
2022-03-09 16:56:35 +00:00
D. Berge
803a08a736 Merge branch '187-create-qc-results-api-endpoints' into '184-refactor-qcs'
Resolve "Create QC results API endpoints"

See merge request wgp/dougal/software!23
2022-03-09 16:55:57 +00:00
D. Berge
c86cbdc493 Refactor QC view to use new API endpoint.
This provides essentially the same user experience as the old
endpoint, with one exception as of this commit:

* The user is not able to “accept” or “unaccept” QC events.
2022-03-09 17:50:55 +01:00
D. Berge
186615d988 Add comments for ease of browsing 2022-03-09 17:43:51 +01:00
D. Berge
666f91de18 Add QC results API endpoint 2022-03-09 17:43:10 +01:00
D. Berge
c8ce786e39 Add API middleware for returning QC results 2022-03-09 17:41:27 +01:00
D. Berge
73cb26551b Add library functions for getting QC results from DB.
We return the QC definitions tree structure, augmented with
a `sequences` attribute which contains `raw_lines` tuples
which are in turn augmented with a `shots` attribute
containing `event_log` tuples. The whole structure looks
something like:

qc_test:
  qc_test:
    sequences:
      - sequence0:
          shots: [sp0, sp1, …]
      - sequence1:
          shots: [sp0, sp1, …]
  qc_test:
    sequences:
      - sequence0:
          shots: [sp0, sp1, …]
  …
2022-03-09 17:35:12 +01:00
D. Berge
d90acb1aeb Add utility to convert QC definitions tree into a flat list 2022-03-09 17:32:23 +01:00
D. Berge
14a2f57c8d Refactor QC execution and results saving.
The results are now saved as follows:

For shot QCs, failing tests result in an event being created in
the event_log table. The text of the event is the QC result message,
while the labels are as set in the QC definition. It is conventionally
expected that these include a `QC` label. The event `meta` contains a
`qc_id` attribute with the ID of the failing QC.

For sequences, failing tests result in a `meta` entry under `qc`, with
the QC ID as the key and the result message as the value.

Finally, the project's `info` table still has a `qc` key, but unlike
with the old code, which stored all the QC results in a huge object
under this key, now only the timestamp of the last time a QC was run on
this project is stored, as `{ "updatedOn": timestamp }`.

The QCs are launched by calling the main() function in /lib/qc/index.js.
This function will first check the timestamp of the files imported into
the project and only run QCs if any of the file timestamps are later
than `info.qc.updatedOn`. Likewise, for each sequence, the timestamp of
the files making up that sequence is checked against
`info.qc.updatedOn` and only those which are newer are actually
processed. This cuts down the running time very considerably.

The logic now is much easier on memory too, as it doesn't load the
whole project at once into memory. Instead, shotpoint QCs are processed
first, and for this a cursor is used, fetching one shotpoint at a
time. Then the sequence QCs are run, also one sequence at a time
(fetched via an individual query touching the `sequences_summary` view,
rather than via a cursor; we reuse some of the lib/db functions here),
for each sequence all its shotpoints and a list of missing shots are
also fetched (via lib/db function reuse) and passed to the QC functions
as predefined variables.

The logic of the QC functions is also changed. Now they can return:

* If a QC passes, the function MUST return boolean `true`.

* If a QC fails, the function MAY return a string describing the nature
  of the failure, or in the case of an `iterate: sequence` type test,
  it may return an object with these attributes:

  - `remarks`: a string describing the nature of the failure;
  - `labels`: a set of labels to associate with this failure;
  - `shots`: an object in which each attribute denotes a shotpoint number
    and the value consists of either a string or an object with
    `remarks` (string), `labels` (array of strings) attributes. This allows
    us to add detail about which shotpoints exactly contribute to cause a
    sequence-wide test failure (this may not be applicable to every
    sequence-wide QC) and it's also a handy way to detect and insert events
    for missing shots.

* For QCs which may give false positives, such as missing gun data, a
  new QC definition attribute is introduced: if `ignoreAllFailed` is
  boolean `true` and all shots fail the test for a sequence, or all
  sequences fail the test for a prospect, the results of the QC will be
  ignored, as if the test had passed. This is mostly to deal with gun or
  any other data that may be temporarily missing.
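
An illustrative `iterate: sequence` QC returning the shapes described
above (the function name and parameters are assumptions):

function missingShotsQC (sequence, shots, missing) {
  if (!missing.length) return true;  // QC passes
  return {
    remarks: `${missing.length} missing shots`,
    labels: ["QC"],
    shots: Object.fromEntries(missing.map(sp => [sp, "Missing shot"]))
  };
}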
2022-03-07 21:41:10 +01:00
D. Berge
67f8b9c6dd Bypass permissions check on info.put() if role is null.
The comparison is strict non-equality so a null role cannot
be forced via the API.

This is needed so that we can reuse this function to
save QC results, which is something that does not take
place over the API.
2022-03-07 21:20:21 +01:00
D. Berge
d3336c6cf7 Add fetchRow DB function.
Helper function to fetch a row at a time using a cursor.
2022-03-07 21:16:43 +01:00
D. Berge
17bb88faf4 Cope with P1/11s with no S records 2022-03-07 21:08:22 +01:00
D. Berge
a52c7e91f5 Document in runner.sh how to run ASAQC in test mode 2022-03-07 21:07:20 +01:00
D. Berge
8debe60d5c Cope with undefined labels 2022-03-02 19:39:29 +01:00
D. Berge
ee9a33513a Update database README 2022-02-28 21:27:20 +01:00
D. Berge
723c9cc166 Make it possible to repeatedly apply DB upgrade 11.
Even though this makes PostgreSQL 14 a hard dependency.
2022-02-28 21:26:19 +01:00
D. Berge
cb952d37f7 Fix: do not require file that no longer exists 2022-02-28 21:25:00 +01:00
D. Berge
d5fc04795d Make rows dense.
This should probably be turned into an option controlled by the
user.
2022-02-27 19:59:06 +01:00
D. Berge
4e0737335f Add row context menu.
It replaces the `Actions` column in the old table and provides
more actions.

The user can now edit not just the comments and labels but also
the timestamp / shotpoint as requested in #78 (closes #78).

Because events are grouped by timestamp / shotpoint (each row
represents a unique timestamp or shotpoint), the behaviour is
slightly different depending on whether the user clicks on a
row containing a single (editable) event, or on one of multiple
editable events in the same row. Also, rows containing only
read-only events are recognised and no editing actions are
provided for those.
2022-02-27 19:59:06 +01:00
D. Berge
d47c8a9e10 Add (disabled) active row highlighter.
It implements the same functionality as in other tabs
such as sequences, lines, etc., but it is disabled here
because in my opinion it doesn't look too nice.

It will probably be a matter of enabling it at some point
and asking for feedback on user preference.
2022-02-27 19:56:21 +01:00
D. Berge
7ea0105d9f Add popularLabels computed property.
Returns a list of labels used in the current view,
in order of popularity (most used first).

NOTE: this property is not actually used. It's
technically dead code.
2022-02-27 19:56:21 +01:00
D. Berge
8f4bda011b Add dialogue to edit event labels.
This assumes that adding or removing labels is a relatively
common action to do on an event and provides a quicker
and simpler mechanism than bringing up the full event
dialogue.

This is meant to be invoked from a context menu action or
similar.
2022-02-27 19:56:21 +01:00
D. Berge
48505dbaeb View event history.
When an event has been modified, this control opens a dialogue
where the previous version of the event may be reviewed and if
necessary restored.

Technically, this was the crux of #138, which it closes.
2022-02-27 19:56:21 +01:00
D. Berge
278c46f975 Adapt events view to new schema 2022-02-27 19:56:21 +01:00
D. Berge
180343754a Remove old event edit dialogue 2022-02-27 19:56:21 +01:00
D. Berge
9aa9ce979b Replace event edit dialogue.
The old <dougal-event-edit-dialog/> gets replaced by
<dougal-event-edit/> which handles the new events schema.
2022-02-27 19:56:21 +01:00
D. Berge
1e5be9c655 Add new event edit dialogue.
Replaces <dougal-event-edit-dialog/>.
2022-02-27 19:56:21 +01:00
D. Berge
0be5dba2b9 Return also labels from <dougal-context-menu/>.
Keeping in mind that the input model is a tree and labels
may be at any level in the tree, not just in the leaves.
2022-02-27 19:56:21 +01:00
D. Berge
0c91e40817 Fix <dougal-context-menu/> default prop value 2022-02-27 19:56:21 +01:00
D. Berge
c1440c7ac8 Simplify <dougal-context-menu/> model 2022-02-27 19:56:21 +01:00
D. Berge
606f18c016 Add Vuex position and timestamp getters for real-time event 2022-02-27 19:56:21 +01:00
D. Berge
febf109cce Update API description 2022-02-27 19:56:21 +01:00
D. Berge
9b700ffb46 Update required database schema 2022-02-27 19:56:21 +01:00
D. Berge
9aca927e49 Update version checking mechanism.
Checks both database schema and API versions.
2022-02-27 19:56:21 +01:00
D. Berge
adaa1a6b8a Add version number to API 2022-02-27 19:56:21 +01:00
D. Berge
8790a797d9 Allow restricting by timestamp or position.
Closes #181.
2022-02-27 19:56:21 +01:00
D. Berge
d7d75f34cd Remove event caching.
That was a horrible kludge and should not be necessary with the
new schema, which is simpler and much faster.
2022-02-27 19:56:21 +01:00
D. Berge
950582a5c6 Refactor event middleware and db code to use new tables 2022-02-27 19:56:21 +01:00
D. Berge
d0da1b005b Add replaceMarkers utility function 2022-02-27 19:56:21 +01:00
D. Berge
1e2c816ef3 Add database upgrade file 13.
Drops the old event tables.

NOTE: consider not applying this patch until confident that
the migration has proceeded smoothly. Dougal can operate just
fine without it.
2022-02-27 19:56:21 +01:00
D. Berge
54b457b4ea Add database upgrade file 12.
Migrates data from old event tables to new.
2022-02-27 19:56:21 +01:00
D. Berge
4d2efd1e04 Move sequence events middleware to a different path.
This is to make room for a new endpoint to retrieve
data for individual events.
2022-02-27 19:56:21 +01:00
D. Berge
920ea83ece Add API endpoint to retrieve a single shotpoint.
This will be used by the new event dialogue in the
frontend to get shotpoint information when creating
or editing events.
2022-02-27 19:56:21 +01:00
D. Berge
d33fe4e936 Add database utilities file.
Intended to contain reusable functions.
2022-02-27 19:56:21 +01:00
D. Berge
c347b873c5 Update database README.
Add information on restoring from backup and troubleshooting
details when migrating PostgreSQL versions.
2022-02-27 19:56:21 +01:00
D. Berge
0c6567d8f8 Add database upgrade file 11 2022-02-27 19:56:12 +01:00
D. Berge
195741a768 Merge branch '173-do-not-use-inodes-as-part-of-a-file-s-fingerprint' into 'devel'
Resolve "Do not use inodes as part of a file's fingerprint"

Closes #173

See merge request wgp/dougal/software!19
2022-02-07 16:08:04 +00:00
D. Berge
0ca44c3861 Add database upgrade file 10.
NOTE: this is the first time we modify the actual data
in the database, as opposed to adding to the schema.
2022-02-07 17:05:19 +01:00
D. Berge
53ed096e1b Modify file hashing function.
We remove the inode from the hash as it is unstable when the
files are on an SMB filesystem, and replace it with an MD5
of the absolute file path.
2022-02-07 17:03:10 +01:00
D. Berge
75f91a9553 Increment schema wanted version 2022-02-07 17:02:59 +01:00
D. Berge
40b07c9169 Merge branch '175-add-database-versioning-and-migration-mechanism' into 'devel'
Resolve "Add database versioning and migration mechanism"

Closes #175

See merge request wgp/dougal/software!18
2022-02-07 14:43:50 +00:00
D. Berge
36e7b1fe21 Add database upgrade file 09 2022-02-06 23:26:57 +01:00
D. Berge
e7fa74326d Add README to database upgrades directory 2022-02-06 23:24:24 +01:00
D. Berge
83be83e4bd Check database schema compatibility.
The server will not start unless it satisfies itself that we're
running against a compatible database schema.
2022-02-06 22:52:45 +01:00
D. Berge
81ce6346b9 Add database schema information to package.json.
Used to determine if the actual schema on the database
is compatible with the version of the server we're
attempting to run.
2022-02-06 22:51:25 +01:00
D. Berge
923ff1acea Add more details to package.json 2022-02-06 22:50:44 +01:00
D. Berge
8ec479805a Add version reporting library.
This reports the current server version, from Git by
default.

Also, and of more interest, it reports whether the
current database schema is compatible with the
server code.
2022-02-06 22:48:20 +01:00
D. Berge
f10103d396 Enforce info key access restrictions on the API.
Obviously, those keys can be edited freely at the database
level. This is intended.
2022-02-06 22:40:53 +01:00
D. Berge
774bde7c00 Reserve certain keys on info tables 2022-02-06 22:39:11 +01:00
D. Berge
b4569c14df Update database README.
Document how to create a Dougal database from scratch
and how to update PostgreSQL.
2022-02-06 22:28:21 +01:00
D. Berge
54eea62e4a Fix require path 2022-02-06 14:24:25 +01:00
D. Berge
69c4f2dd9e Merge branch '161-transfer-files-to-asaqc' into 'devel'
Resolve "Transfer files to ASAQC"

Closes #161

See merge request wgp/dougal/software!16
2021-10-09 09:23:54 +00:00
D. Berge
acc829b978 Switch to production URL in ASAQC configuration 2021-10-06 04:16:17 +02:00
D. Berge
ff4913c0a5 Instrument getLineName to monitor probable cause of #165 2021-10-06 02:12:05 +02:00
D. Berge
51452c978a Add ASAQC task to runner 2021-10-04 21:26:13 +02:00
D. Berge
927ef71ecc Send Ocp-Apim-Subscription-Key with ASAQC requests 2021-10-04 21:00:41 +02:00
D. Berge
14541bcb95 Make code compatible with NodeJS 14 2021-10-04 16:52:04 +02:00
D. Berge
5c190e5554 Add ASAQC queue processor.
This code implements the backend processing side
of the ASAQC queue, i.e., the bit that communicates
with the remote API.

Its expected use is to have it running at regular
intervals, e.g., via cron. The entry point is:

lib/www/server/queues/asaqc/index.js

That file is executable and can be run directly
from the shell or within a script. Read the comments
in that file for further instructions.
2021-10-04 02:21:00 +02:00
D. Berge
0f447fc27d Add ASAQC API mock-up.
To be used for testing and debugging. See
index.js for instructions.
2021-10-04 02:21:00 +02:00
D. Berge
dfbccf3bc6 Add ASAQC (test) server details to configuration.
The URL corresponds to that of a built-in test server.

Note that the /etc/ssl directory is protected against
accidental inclusion into the repository by commit
458b6837. The TLS private key should *never* be
committed.
2021-10-04 02:21:00 +02:00
D. Berge
a491530018 Add ASAQC transfer support to client (sequence list) 2021-10-04 02:21:00 +02:00
D. Berge
c7784aa52f Add ASAQC queue endpoints to API 2021-10-04 02:21:00 +02:00
D. Berge
0533314b01 Add DOUGAL_ROOT property to configuration object 2021-10-04 02:21:00 +02:00
D. Berge
8da664a025 Add directory for TLS certificates.
And add it to .gitignore so its contents do not get committed
by accident.
2021-10-04 02:21:00 +02:00
D. Berge
6debf5c355 Add queue-related functions to the database interface.
These functions, in general following the same HTTP-verb
approach as the rest of the database interface, are for
use with both the HTTP API and the queue processor.
2021-10-04 02:21:00 +02:00
D. Berge
db8efce346 Remove dead code 2021-10-04 02:21:00 +02:00
D. Berge
b107c71c6f Add option to get only summary info for a sequence.
Which is faster when we don't need the shotpoint data.
2021-10-04 02:21:00 +02:00
D. Berge
ef12168811 Make it possible to list one specific sequence 2021-10-04 02:21:00 +02:00
D. Berge
e1dc970db4 Add export functions for SeisJSON data.
These functions abstract the creation of SeisJSON payloads
and their various representations as GeoJSON, HTML or PDF.
2021-10-04 02:21:00 +02:00
D. Berge
f2de8509cc Make Babel support logical assignment operators.
That's ||=, &&=, ??=, and the like.
2021-10-04 02:21:00 +02:00
D. Berge
1e6c6ef961 Add throttle() helper.
Useful to avoid repeated updates triggered by
incoming row-level database events.
2021-10-04 02:21:00 +02:00
D. Berge
38e56394d4 Add queue_items to the list of DB events to listen for 2021-10-04 02:21:00 +02:00
D. Berge
374fb7de67 Add database upgrade file 08 2021-10-04 02:21:00 +02:00
D. Berge
978256ceab Describe ASAQC-related API endpoints 2021-10-04 02:21:00 +02:00
D. Berge
5a7fe9b38a Update API version description 2021-10-04 02:21:00 +02:00
D. Berge
83c992c0d9 Fix description of endpoints authorisation 2021-10-04 02:21:00 +02:00
D. Berge
18ee28d72e Describe HTTP 401 responses explicitly 2021-10-04 02:21:00 +02:00
D. Berge
6bc3aff587 Change server names in API description 2021-10-04 02:21:00 +02:00
D. Berge
74b3de5c26 Merge branch '75-quality-control-dashboard' into 'devel'
Resolve "Quality control dashboard" – sequence visualisations

Closes #143, #142, and #150

See merge request wgp/dougal/software!14
2021-10-01 21:17:17 +00:00
D. Berge
57a08c93bc Add link to graphics tab from sequence list 2021-09-28 22:16:12 +02:00
D. Berge
fabc9fe757 Do not make graphs editable 2021-09-28 18:30:26 +02:00
D. Berge
6f32f24481 Add configuration dialog to Graphs.
Lets the user choose which aspects (graphs) he wants to
be visible.
2021-09-28 18:17:38 +02:00
D. Berge
dffe7defbb Add tooltips to Graphs toolbar 2021-09-28 18:16:57 +02:00
D. Berge
b9844528f1 Add graphBar to resizeObserver.
This ensures that it is always the right size when it first
gets displayed.
2021-09-28 18:15:19 +02:00
D. Berge
cd78dbd0d8 Fix typos in resizeObserver 2021-09-28 18:14:39 +02:00
D. Berge
798203be9f Add preferences support to DougalGraphGunsPressure 2021-09-28 18:12:37 +02:00
D. Berge
5bfd7dc835 Add preferences support to DougalGraphGunsDepth 2021-09-28 18:11:43 +02:00
D. Berge
c17862fbbb Add preferences support to DougalGraphGunsTiming 2021-09-28 18:11:04 +02:00
D. Berge
04c0369923 Add preferences support to DougalGraphArraysIJScatter 2021-09-28 18:10:08 +02:00
D. Berge
026cfb6f98 Rename GraphArraysIJScatter to DougalGraphArraysIJScatter 2021-09-28 18:08:48 +02:00
D. Berge
a4e6ec0712 Add support for personalising QC graph settings.
Preferences are read from the store and passed to graph components
via the `settings` prop. Component may changed their own settings
by emitting the `update:settings` signal.
2021-09-28 17:59:32 +02:00
D. Berge
b3e052cb12 Add utility function to filter preferences by a prefix 2021-09-28 17:53:07 +02:00
D. Berge
cf88ecf172 Save user preferences to Vuex store.
The user preferences are saved in the browser's localStorage and
read by setCredentials() whenever that function is called. From
that point they are cached in the Vuex store.

Provided that preferences are only modified through the store,
via the saveUserPreference() call, the preferences should always
be in sync between the store and the browser.

The preferences object is a key/value store. Each key is
expected to be in the form of a series of dot-separated prefixes,
e.g., `UserX.RoleY.Graphs.GraphType1.setting0`.

For user preferences, the first two prefix elements should be the
username and role of the user that the setting applies to. These will
be automatically added and stripped by saveUserPreference() and
loadUserPreferences() respectively.
2021-09-28 17:42:49 +02:00
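A sketch of the prefix handling (Python; hypothetical helpers mirroring saveUserPreference() and loadUserPreferences()):

    def save_user_preference(store, username, role, key, value):
        # The username and role prefixes are added automatically...
        store[f"{username}.{role}.{key}"] = value

    def load_user_preferences(store, username, role):
        # ...and stripped again when the preferences are read back.
        prefix = f"{username}.{role}."
        return {k[len(prefix):]: v
                for k, v in store.items() if k.startswith(prefix)}

For example, save_user_preference(store, "UserX", "RoleY", "Graphs.GraphType1.setting0", 42) stores the key shown above.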
D. Berge
e267440711 Move comment to right place 2021-09-28 17:30:48 +02:00
D. Berge
454094b187 Refactor gun heatmaps component.
Fixes #150.

Contributes towards the goal of #149, as irrelevant data (such
as for non-firing guns) is no longer shown at all. This affects:

* Firetime (only active array data shown)
* Gun deltas (only active array shown)
* Fill time (only non-active array shown)
2021-09-21 00:32:00 +02:00
D. Berge
862e754a6f Fix labelling of gun mode and detect heatmaps.
Fixes #142.
2021-09-20 00:18:31 +02:00
D. Berge
894877750e Make heatmap hover box more informative.
Closes #143.
2021-09-20 00:17:35 +02:00
D. Berge
09b45d5d65 Swap outlier colours 2021-09-11 21:30:12 +02:00
D. Berge
1352c3b312 Make graph colours consistent for port / starboard elements 2021-09-11 19:19:58 +02:00
D. Berge
30aa2c302e Add graphic aesthetics 2021-09-11 12:38:12 +02:00
D. Berge
3eaa2757b9 Add Graphs tab to navigation bar 2021-09-11 12:19:06 +02:00
D. Berge
6f6af1bbc7 Add graphs/ route to client 2021-09-11 12:19:06 +02:00
D. Berge
019561229c Add Graph component.
It displays a series of data plots.
2021-09-11 12:19:06 +02:00
D. Berge
e212dc8b92 Add unpack helper function to frontend.
Convenience function to extract a key from an
array of objects.
2021-09-11 12:19:06 +02:00
D. Berge
5c00013892 Add graphic library dependencies 2021-09-11 12:19:06 +02:00
D. Berge
1e5bdcc068 Add Vuex functions to load / save user preferences 2021-09-11 12:19:06 +02:00
D. Berge
a280a910f5 Add database upgrade file 07 2021-09-11 12:19:06 +02:00
D. Berge
45fe467a21 Implement sequence/get API endpoint.
It returns data for all individual points in a sequence.
2021-09-11 12:19:06 +02:00
D. Berge
8d3b7adc78 Show azimuths to two decimals in SeisJSON exports 2021-09-04 23:34:53 +02:00
D. Berge
079d3a18b0 Merge branch '131-show-missing-shots-in-sequence-reports' into 'devel'
Resolve "Show missing shots in sequence reports"

Closes #131

See merge request wgp/dougal/software!15
2021-09-04 21:32:44 +00:00
D. Berge
f0b1fc2fe6 Show missed shot events in HTML, PDF exports 2021-09-04 23:29:58 +02:00
D. Berge
987bdf6e21 Add option to export missing shots as SeisJSON events 2021-09-04 23:28:43 +02:00
D. Berge
1d3507b3a4 Export missing shots by default.
Unless explicitly requested by the user by setting the
option `missing` to `false`, a list of missing shotpoints
will be included in the SeisJSON file.
2021-09-04 23:19:25 +02:00
D. Berge
a82fc7bc8a Recover from feed XML parsing error 2021-09-04 02:43:58 +02:00
D. Berge
29b3c9a250 Show azimuth to two decimals elsewhere too.
Related to #126, might as well use two decimals throughout.
2021-09-02 01:18:47 +02:00
D. Berge
040c1ead96 Show azimuth to two decimal places.
In planner report template.

Closes #126.
2021-09-02 01:17:40 +02:00
D. Berge
1c7bed0c15 Fix returning next planned sequence number.
If no sequences have been shot, return 1 instead of null as the
next available sequence number.

Fixes #125.
2021-09-02 01:04:38 +02:00
D. Berge
dfcda1b2d9 Merge branch '103-24-hour-lookahead-planning-report' into 'devel'
Resolve "24-hour lookahead planning report"

Closes #103

See merge request wgp/dougal/software!13
2021-06-21 14:53:35 +00:00
D. Berge
b3aadfc33c Merge branch '60-update-planner-as-sequences-are-shot' into 'devel'
Resolve "Update planner as sequences are shot"

Closes #60

See merge request wgp/dougal/software!12
2021-06-21 14:52:11 +00:00
D. Berge
d5980d9154 Add CSV planner output option 2021-06-19 19:04:05 +02:00
D. Berge
b5f2945c8b Fix end time in plan HTML template 2021-06-19 15:43:04 +02:00
D. Berge
9bbffe2ae0 React to changes in planner remarks 2021-06-19 12:27:36 +02:00
D. Berge
09f60d6c18 Add database upgrade file 06 2021-06-19 12:23:25 +02:00
D. Berge
81d9ea19cc Add adjust_planner() function to DB schema.
It updates the planned lines details according to production and current
time.
2021-06-19 12:18:28 +02:00
D. Berge
497d4d68f9 Call notify on changes to schema's info table 2021-06-19 12:17:26 +02:00
D. Berge
853deca3c3 Rename misnamed trigger 2021-06-19 12:16:37 +02:00
D. Berge
99f1530db3 Replace phone icon in template.
Strangely enough, the emoji icon seems to work reliably across
platforms.
2021-05-31 02:54:38 +02:00
D. Berge
b325ae3452 Let the user know when there are no planner comments 2021-05-31 02:47:20 +02:00
D. Berge
f97d334fe5 Improve the aesthetics of the planner remarks section 2021-05-31 02:41:58 +02:00
D. Berge
cb114f01cd Add GUI support for downloading planner data.
Including HTML and PDF formats, which constitutes the lookahead report.
2021-05-31 02:29:50 +02:00
D. Berge
707df76b70 Add GUI support for saving planner remarks.
They get saved to `/project/:project/info/plan/remarks`.
2021-05-31 02:29:50 +02:00
D. Berge
bba050032f Add POST, PUT, DELETE support to /project/:project/info.
It reuses the same backend functions as for the global `/info/` path.
2021-05-31 02:29:50 +02:00
D. Berge
594233c965 Add HTML & PDF planner output options.
Coupled with a suitable Nunjucks template, this is effectively the
24-hour (or whatever period of time) lookahead.
2021-05-31 02:29:50 +02:00
D. Berge
5795c1f87d Add server-side map rendering component.
Based on our own fork of leaflet-headless.
2021-05-31 02:29:50 +02:00
D. Berge
ccd1852f65 Add Nunjucks rendered get filter.
Given an argument consisting of an array of objects and an attribute
name `attr`, it returns an array of all `attr` attributes.
2021-05-31 02:29:50 +02:00
D. Berge
17947df168 Modify Nunjucks rendered timestamp function.
* It accepts a `precision` parameter which truncates the timestamp to a
given precision. Can be `seconds`, `minutes`, `hours` or `days` / `date`.

* It tries to be more flexible in what it accepts as input.

* It accepts an input of "now" which returns the current timestamp. Can
  be used along with `precision`.
2021-05-31 02:29:50 +02:00
D. Berge
041878096d Accept a mime query parameter to force MIME type 2021-05-31 02:29:50 +02:00
D. Berge
ea3e31058f Refactor the planned lines editing logic.
We move most of the logic from the client (as it was until now) to the
server.

The PATCH command maintains the same format but it should provide only
one of the following keys per request:

* ts0
* ts1
* speed
* fsp
* lsp
* lagAfter
* sequence

   Earlier keys in the list above take priority over later ones.

The following keys may be provided by themselves or in combination with
each other (but not with any of the above):

* name
* remarks
* meta

As a special case, an empty string as the `name` value causes the name
to be auto-generated.

See comments in the code `patch.js` for details on the update logic.
2021-05-28 20:30:59 +02:00
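The priority rule can be pictured like this (Python sketch, not the actual server code):

    EXCLUSIVE_KEYS = ["ts0", "ts1", "speed", "fsp", "lsp", "lagAfter", "sequence"]

    def pick_exclusive_key(patch):
        # Only one of these keys is honoured per request; earlier
        # entries in the list take priority over later ones.
        for key in EXCLUSIVE_KEYS:
            if key in patch:
                return key
        return None  # only name / remarks / meta present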
D. Berge
534a54ef75 Add database upgrade file 05 2021-05-28 20:30:59 +02:00
D. Berge
f314536daf Change planned_lines trigger from statement to row.
Because a) it tells us what has changed and b) doesn't fire if we
didn't actually change anything.
2021-05-28 20:30:59 +02:00
D. Berge
de4aa52417 Make planned_lines primary key deferrable.
Helps when we need to renumber sequences.
2021-05-28 20:30:59 +02:00
D. Berge
758b13b189 Add saillines layer to map 2021-05-28 20:30:29 +02:00
D. Berge
967db1dec6 Include NTBA status in preplot GIS output 2021-05-28 20:29:57 +02:00
D. Berge
91fd5e4559 Ensure that timestamp is always a Date object 2021-05-27 17:50:01 +02:00
D. Berge
cf171628cd Fix error in editing of planned line start time 2021-05-27 17:49:32 +02:00
D. Berge
94c29f4723 Change the sunset / sunrise times reported via the tooltip.
The icon still uses the lower edge of the sun to calculate day / night,
but the tooltip shows actual sunrise and sunset times.
2021-05-27 02:08:30 +02:00
D. Berge
14b2e55a2e Remove edit controls from planner for read-only users.
Left over from #108.
2021-05-27 01:32:03 +02:00
D. Berge
c30e54a515 Round vessel speeds to 0.1 kt 2021-05-27 01:09:28 +02:00
D. Berge
7ead826677 Show sunrise / sunset times in the planner.
* A ‘sun’ icon is shown when a line starts and ends in daytime
* A ‘moon’ icon is shown when a line starts and ends in nighttime
* A ‘sun/moon’ icon is shown in other cases

Sunrise and sunset times are provided as a tooltip when hovering over
the icon.

Closes #72.
2021-05-27 01:02:42 +02:00
D. Berge
7aecb514db Clear QC metadata when importing gun data.
Fixes #118.
2021-05-26 00:30:58 +02:00
D. Berge
ad395aa6e4 Include the planned lines table in system dumps 2021-05-26 00:15:09 +02:00
D. Berge
523ec937dd Always merge metadata on import.
The INSERT INTO raw_lines / final_lines will not always be executed as
the lines may already exist (particularly in raw_lines because of
*online*), so whether or not the insert actually ran, we merge the
metadata immediately afterwards (this may cause an extra notification
to be fired).
2021-05-25 03:19:42 +02:00
D. Berge
9d2ccd75dd Do not try to use line name if there isn't one 2021-05-25 03:19:00 +02:00
D. Berge
3985a6226b Suggest ${lineName}-NavLog.${extension} as file name.
This is for the usual case where only one sequence is requested.

When more than one sequence is requested, the suggested name comes out
as ${projectId}-${sequenceList}.${extension}, where `sequenceList` is
the list of sequence numbers separated by semicolons, e.g.:
eq21203-37;38;39.html.

Closes #116.
2021-05-25 02:23:41 +02:00
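Equivalent logic as a Python sketch (the real code runs in the frontend):

    def suggested_filename(project_id, sequences, line_name, extension):
        # One sequence requested: use the line name; several: join the numbers.
        if len(sequences) == 1:
            return f"{line_name}-NavLog.{extension}"
        sequence_list = ";".join(str(s) for s in sequences)
        return f"{project_id}-{sequence_list}.{extension}"

For instance, suggested_filename("eq21203", [37, 38, 39], None, "html") gives "eq21203-37;38;39.html".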
D. Berge
7d354ffdb6 Add database upgrade file 2021-05-25 02:21:11 +02:00
D. Berge
3d70a460ac Output raw and final lines metadata in summary views 2021-05-25 02:13:50 +02:00
D. Berge
caae656aae Fix event detection failure.
There was a typo in the channel detection logic, resulting
in bogus events full of `undefined` data values.

Fixes #115.
2021-05-24 18:30:53 +02:00
D. Berge
5708ed1a11 Merge branch '57-make-event-log-entries-for-start-and-end-of-line-upon-import-of-final-sequence-if-the-entries-do' into 'devel'
Resolve "Make event log entries for start and end of line upon import of final sequence, if the entries do not already exist"

Closes #57

See merge request wgp/dougal/software!11
2021-05-24 15:44:58 +00:00
D. Berge
ad3998d4c6 Add database upgrade file 2021-05-24 17:41:11 +02:00
D. Berge
8638f42e6d Add database upgrade files.
These files contain the sequence of SQL commands needed to bring
a database or project schema up to date with the latest template
database or project schema.

These files must be applied manually. Check the comments at the top of
the file for instructions.
2021-05-24 17:39:01 +02:00
D. Berge
bc5aef5144 Run post-import functions after final lines.
The reason we need to do it like this instead of relying on a trigger
is that the entry in final_lines is created first and the final_shots
are populated afterwards. If we fire the trigger on final_lines it is
not going to find any shots; if we fire it as a row trigger on
final_shots it would try to label every point in the sequence as it is
imported; finally, if we fire it as a statement trigger on final_shots
we have no idea which sequence was imported.
2021-05-24 16:59:56 +02:00
D. Berge
2b798c3ea3 Ignore attempts to put the same label twice on the same event 2021-05-24 16:59:20 +02:00
D. Berge
4d97784829 Upgrade database project schema template.
Adds:

* label_in_sequence (_sequence integer, _label text):
  Returns events containing the specified label.

* handle_final_line_events (_seq integer, _label text, _column text):
  - If _label does not exist in the events for sequence _seq:
    it adds a new _label label at the shotpoint obtained from
    final_lines_summary[_column].
  - If _label does exist (and hasn't been auto-added by this function
    in a previous run), it will add information about it to the final
    line's metadata.

* final_line_post_import (_seq integer):
  Calls handle_final_line_events() on the given sequence to check
  for FSP, FGSP, LGSP and LSP labels.

* events_seq_labels_single ():
  Trigger function to ensure that labels that have the attribute
  `model.multiple` set to `false` occur at most only once per
  sequence. If a new instance is added to a sequence, the previous
  instance is deleted.

* Trigger on events_seq_labels that calls events_seq_labels_single().

* Trigger on events_timed_labels that calls events_seq_labels_single().
2021-05-24 16:49:39 +02:00
D. Berge
13da38b4cd Make websocket notifications await.
Not sure if this helps much. It might help with avoiding
out of order notifications and reducing the rate at which
the clients get spammed when importing database dumps and
such, but that hasn't been tested.
2021-05-24 15:52:29 +02:00
D. Berge
5af89050fb Refactor SOL/EOL real-time detection handler.
This also implements a generic handler mechanism that can be
reused for other purposes, such as sending email / XMPP notifications,
doing real-time QC checks and so on.

Fixes #113.
2021-05-24 13:48:53 +02:00
D. Berge
d40ceb8343 Refactor list of notification channels into its own file 2021-05-24 13:38:19 +02:00
D. Berge
56d1279584 Allow api action to make arbitrary HTTP(S) requests.
If the URL is an absolute HTTP(S) one, we use it as-is.
2021-05-24 13:35:36 +02:00
D. Berge
d02edb4e76 Force the argument into String prior to splitting 2021-05-24 13:32:03 +02:00
D. Berge
9875ae86f3 Record P1/11 line name in database on import 2021-05-24 13:30:25 +02:00
D. Berge
53f71f7005 Set primary key on events_seq_labels in schema template 2021-05-23 22:27:00 +02:00
D. Berge
5de64e6b45 Add meta column to events view in schema template 2021-05-23 22:26:00 +02:00
D. Berge
67af85eca9 Recognise PENDING status in sequence imports.
If a final sequence file or directory name matches a pattern
which is recognised to indicate a ‘pending acceptance’ status,
the final data (if any exists) for that sequence will be deleted
and a comment added to the effect that the sequence has been
marked as ‘pending’.

To accept the sequence, rename its final file or directory name
accordingly.

Note: it is the *final* data that is searched for a matching
pattern, not the raw.

Closes #91.
2021-05-21 15:15:15 +02:00
D. Berge
779b28a331 Add info table to system dumps 2021-05-21 12:18:36 +02:00
D. Berge
b9a4d18ed9 Do not fail if no equipment has been defined.
Fixes #112.
2021-05-20 21:16:39 +02:00
D. Berge
0dc9ac2b3c Merge branch '71-add-equipment-info-to-the-logs' into 'devel'
Resolve "Add equipment info to the logs"

Closes #71

See merge request wgp/dougal/software!10
2021-05-20 19:05:35 +00:00
D. Berge
39d85a692b Use default Nunjucks template if necessary.
If the survey configuration does not itself have a template
we will use the one in etc/defaults/templates/sequence.html.njk.

It is not very likely that the template will be changed all that
often and it avoids issues when people forget to copy it across
to a new survey, etc.
2021-05-20 20:38:39 +02:00
D. Berge
e7661bfd1c Do not fail if requested object does not exist 2021-05-20 20:38:08 +02:00
D. Berge
1649de6c68 Update default sequence HTML template 2021-05-20 20:37:37 +02:00
D. Berge
1089d1fe75 Add equipment configuration frontend user interface 2021-05-20 18:35:56 +02:00
D. Berge
fc58a4d435 Implement equipment frontend component 2021-05-20 18:35:56 +02:00
D. Berge
c832d8b107 Commit default template for sequences 2021-05-20 18:35:56 +02:00
D. Berge
4a9e61be78 Add unique filter to Nunjucks renderer 2021-05-20 18:35:56 +02:00
D. Berge
8cfd1a7fc9 Export equipment info to Seis+JSON files 2021-05-20 18:35:56 +02:00
D. Berge
315733eec0 Refactor events export middleware.
Uses the `prepare` method for better reusability.
2021-05-20 18:35:56 +02:00
D. Berge
ad422abe94 Add prepare method for Seis+JSON and related exports.
It retrieves the data necessary for a complete Seis+JSON
export, including equipment info.
2021-05-20 18:35:56 +02:00
D. Berge
92210378e1 Listen for and broadcast info notifications 2021-05-20 18:21:01 +02:00
D. Berge
8d3e665206 Expose new API endpoint: /info/:path(*).
Provides CRUD access to values (which may be deeply nested) from the
global `info` table.
2021-05-20 18:19:29 +02:00
D. Berge
4ee65ef284 Implement info/delete middleware 2021-05-20 18:18:26 +02:00
D. Berge
d048a19066 Implement info/put middleware 2021-05-20 18:18:13 +02:00
D. Berge
97ed9bcce4 Implement info/post middleware 2021-05-20 18:17:52 +02:00
D. Berge
316117cb83 Implement info.delete() database method.
It deletes a (possibly deeply nested) element in the
`info` table.
2021-05-20 18:16:26 +02:00
D. Berge
1d38f6526b Implement info.put() database method.
Replaces an existing element with a new one, or inserts it
if there is nothing to replace. The element may be deeply
nested inside a JSON object or array in the `info` table.

Works for both public.info and survey_?.info.
2021-05-20 18:14:43 +02:00
D. Berge
6feb7d49ee Implement info.post() database method.
It adds an element to a JSON array corresponding to a
key in the info table. Errors out if the value is not
an array.
2021-05-20 18:13:15 +02:00
D. Berge
ac51f72180 Ignore empty path parts in info.get() 2021-05-20 18:10:51 +02:00
D. Berge
86d3323869 Remove logging statement 2021-05-20 18:10:27 +02:00
D. Berge
b181e4f424 Let the user set the search path to no survey.
This is so that we can access tables in the `public`
schema which are overloaded by survey tables, as is
the case with `info`.
2021-05-20 18:08:03 +02:00
D. Berge
7917eeeb0b Add table info to schema.
This one is independent of any projects so it goes
into `public`.
2021-05-20 18:07:05 +02:00
D. Berge
b18907fb05 Merge branch '53-mark-points-as-not-to-be-acquired-ntba' into 'devel'
Resolve "Mark points as ‘not to be acquired’ (NTBA)"

Closes #53

See merge request wgp/dougal/software!9
2021-05-17 18:34:46 +00:00
D. Berge
3e1861fcf6 Update API description 2021-05-17 20:30:59 +02:00
D. Berge
820b0c2b91 Add set line complete / incomplete actions.
The following options are shown:

* Set line complete:

If a line has been partially shot and still has points
to be acquired.

This option marks remaining virgin points as NTBA=true.

* Set line incomplete:

If a line has been partially shot and remaining virgin
points have been marked as NTBA.

This option marks all points in the line as NTBA=false.

* Set line NTBA:

If a line has not been (successfully) shot at all, i.e.,
all points on the line are virgin.

This option marks the line itself as NTBA=true.

* Unset line NTBA:

If a line has been marked as NTBA.

This option clears the NTBA flag from the line.
2021-05-17 20:19:53 +02:00
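Schematically, the applicability rules might read as follows (Python sketch; field names hypothetical, with to_acquire counting virgin points not flagged NTBA):

    def line_actions(line):
        # Which context-menu actions apply, following the rules above.
        actions = []
        partially_shot = 0 < line["virgin"] < line["total"]
        if partially_shot and line["to_acquire"] > 0:
            actions.append("set line complete")
        if partially_shot and line["to_acquire"] == 0:
            actions.append("set line incomplete")
        if line["virgin"] == line["total"] and not line["ntba"]:
            actions.append("set line NTBA")
        if line["ntba"]:
            actions.append("unset line NTBA")
        return actions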
D. Berge
57f4834da8 Add information about virgin and remaining points 2021-05-17 20:19:16 +02:00
D. Berge
08d33e293a React also on preplot point changes, not just lines 2021-05-17 20:18:33 +02:00
D. Berge
8e71b18225 Add complete to line PATCH options.
`complete` is a boolean.

If true, any virgin points remaining on the line
will be marked as `ntba=true`.

If false, *all* points on the line will be marked
as `ntba=false`.
2021-05-17 20:15:34 +02:00
D. Berge
f297458954 Report on virgin points and points to be acquired.
Virgin points are those that have not been acquired
(and processed) at least once.

Points to be acquired are virgin points that do not
have the `ntba` flag set.
2021-05-17 20:13:53 +02:00
D. Berge
eb28648e57 Remove bogus dependency 2021-05-17 17:18:35 +02:00
D. Berge
0c352512b0 Enable the ‘view on map’ log action item. 2021-05-17 17:14:58 +02:00
D. Berge
4d87506720 Show a map marker if position given in URL hash.
If the location URL contains a hash of either:

* #z/x/y
* #x/y

In the first case it will zoom and pan to the location;
in the second case it will only pan while maintaining the
current (or last used) zoom level.

If the location URL does not contain a hash in one of those
formats, the marker will be removed from the map.
2021-05-17 17:14:35 +02:00
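For illustration, the hash parsing amounts to something like this (Python sketch):

    def parse_map_hash(fragment):
        # "#z/x/y" pans and zooms; "#x/y" pans only, keeping the current
        # (or last used) zoom; any other hash removes the marker.
        parts = fragment.lstrip("#").split("/")
        if len(parts) == 3:
            z, x, y = parts
            return {"zoom": int(z), "x": float(x), "y": float(y)}
        if len(parts) == 2:
            x, y = parts
            return {"zoom": None, "x": float(x), "y": float(y)}
        return None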
D. Berge
20bce40dac Upgrade Vue components 2021-05-17 14:22:26 +02:00
D. Berge
cf79cf86ae Fix ‘this is undefined’ error 2021-05-16 21:38:31 +02:00
D. Berge
8e4f62e5be Reset snack message when hiding.
This is so that the same message will cause the snack
to be shown again.
2021-05-16 19:58:36 +02:00
D. Berge
a8850e5d0c Protect the /project/:project/meta route 2021-05-16 19:58:03 +02:00
D. Berge
b5a762b5e3 Merge branch '108-remove-edit-controls-for-read-only-users' into 'devel'
Resolve "Remove edit controls for read-only users"

Closes #108

See merge request wgp/dougal/software!8
2021-05-16 17:56:35 +00:00
D. Berge
418f1a00b8 Hide edit controls from read-only users 2021-05-16 19:55:31 +02:00
D. Berge
0d9f7ac4ec Add privilege level getters to Vuex.
* writeaccess: true if user can change data.
* adminaccess: true if user is an administrator.
2021-05-16 19:53:24 +02:00
D. Berge
76c9c3ef2a Assign (some) offline navdata to a survey.
There is no concept of ‘current survey’ in Dougal, and
assigning navigation data to a particular survey is full
of edge cases but sometimes it is necessary or at least
convenient to do so.

This commit implements one such strategy, which consists
of checking the distance to the preplots of all active
surveys (well, those that do have preplots anyway) and
picking the nearest one.

To reduce load, we only do this every once in a while as
governed by the `offline_survey_detect_interval` option
in the configuration.

This strategy is only active if the configuration option
`offline_survey_heuristics == "nearest_preplot"` for the
corresponding navigation header.
2021-05-16 03:16:19 +02:00
D. Berge
ef798860cd Add collect filter to template renderer.
This filter can collect attributes from items having the
same key into a single item.

Can be used in templates like this:

{% for Entry in Sequence.Entries |
   collect("ShotPointId", ["EntryType", "Comment"]) %}

to avoid duplicating shotpoint numbers.
2021-05-15 20:07:02 +02:00
D. Berge
e57c362d94 Fix error with timestamp filter (again) 2021-05-15 20:06:36 +02:00
D. Berge
7605b11fdb Fix error with timestamp Nunjucks filter 2021-05-15 18:59:47 +02:00
D. Berge
84e791fc66 Add more sequence information to SeisJSON file 2021-05-15 18:37:32 +02:00
D. Berge
3e2126cc32 Add option to download reports from sequence list.
The context menu includes options to download the sequence
report in different formats.
2021-05-15 17:12:41 +02:00
D. Berge
b0f4559b83 Allow direct downloading of sequence reports.
If the `download` or `d` query parameter is supplied (even
without any value), the response will include a
`Content-Disposition: attachment` header. A filename will
also be suggested.
2021-05-15 17:10:28 +02:00
D. Berge
c7e2e18cc8 Merge branch '84-produce-human-readable-versions-of-json-structured-sequence-data-exports-sse' into 'devel'
Resolve "Produce human-readable versions of JSON structured sequence data exports (SSE)"

Closes #84

See merge request wgp/dougal/software!7
2021-05-15 13:07:07 +00:00
D. Berge
42697fe91d Provide a default replacement for @POS@ markers 2021-05-15 01:57:46 +02:00
D. Berge
900d7f7a3e Ensure that a geometry exists 2021-05-15 01:57:46 +02:00
D. Berge
f1953807db Add position filters to Vue.
Given some text and an item containing a Point geometry,
the `position` filter replaces occurrences of @POS@ or
@POSITION@ with the item's geometry (it has to be lat/lon).

Occurrences of @DMS@ are replaced with the position in
sexagesimal degrees.

This can be used anywhere a Vue filter can. However, we
have used it in the event comments edit dialogue. The positions
are replaced before saving the comment to the database.
2021-05-15 01:57:46 +02:00
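A rough Python equivalent of the replacement step (the actual filter is written for Vue; the DMS formatting details are an assumption):

    import re

    def to_dms(value, hemispheres):
        # Sexagesimal degrees: 4.5 -> 4°30'00.00"N, for example
        h = hemispheres[0] if value >= 0 else hemispheres[1]
        v = abs(value)
        d = int(v)
        m = int((v - d) * 60)
        s = (v - d - m / 60.0) * 3600
        return f"{d}°{m:02d}'{s:05.2f}\"{h}"

    def replace_position_markers(text, lat, lon):
        decimal = f"{lat:.6f}, {lon:.6f}"
        dms = f"{to_dms(lat, 'NS')} {to_dms(lon, 'EW')}"
        text = re.sub(r"@POS(ITION)?@", decimal, text)
        return text.replace("@DMS@", dms)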
D. Berge
814e071698 Add Markdown support to map tooltips 2021-05-15 01:57:46 +02:00
D. Berge
2aba132220 Add Markdown support to preplot lines comments 2021-05-15 01:57:46 +02:00
D. Berge
15a802227d Add Markdown support to planned lines comments 2021-05-15 01:57:46 +02:00
D. Berge
6745757712 Add Markdown support to log comments 2021-05-15 01:57:45 +02:00
D. Berge
9ff76867c9 Add Markdown support to sequence list comments 2021-05-15 01:57:45 +02:00
D. Berge
e8811560de Add global .markdown class.
It changes textareas to be monospaced.
2021-05-15 01:57:45 +02:00
D. Berge
65b33a6b0f Add Vue Markdown filters.
{{ '**strong** _em_' |markdown }} gives:
<p><strong>strong</strong> <em>em</em></p>

{{ '**strong** _em_' |markdownInline }} gives:
<strong>strong</strong> <em>em</em>
2021-05-15 01:57:45 +02:00
D. Berge
b8b5765b46 Split markdown Nunjucks filter into two new ones.
{{ '**strong** _em_' |markdown }} gives:
<p><strong>strong</strong> <em>em</em></p>

{{ '**strong** _em_' |markdownInline }} gives:
<strong>strong</strong> <em>em</em>
2021-05-15 01:57:45 +02:00
D. Berge
53f4e167f8 Update ‘marked’ version on server 2021-05-15 01:57:45 +02:00
D. Berge
3d8f524d4a Expose PDF output option in user interface 2021-05-15 01:57:45 +02:00
D. Berge
1e68676ac6 Add PDF output option for events log 2021-05-15 01:57:45 +02:00
D. Berge
2c2d594877 Add Selenium webdriver to backend.
Used for generating PDFs via a Firefox instance.
2021-05-15 01:57:45 +02:00
D. Berge
fae849aeab Send specific error message if HTML template not found 2021-05-15 01:57:45 +02:00
D. Berge
1d47495799 Adapt log view controls to small viewports 2021-05-15 01:57:45 +02:00
D. Berge
592632d669 Add timestamp filter to renderer 2021-05-15 01:57:45 +02:00
D. Berge
26c05b9e3c Add Markdown support to template renderer 2021-05-15 01:57:45 +02:00
D. Berge
3f9a40724d Add download menu to sequence logs.
The menu lets the user retrieve a sequence's events
in a variety of formats:

* JSON
* Seis+JSON
* GeoJSON
* HTML
2021-05-15 01:57:45 +02:00
D. Berge
a652a08815 Add GET endpoint for sequence events.
Provides a variety of formats:

* JSON
* Seis+JSON
* GeoJSON
* HTML
2021-05-15 01:57:45 +02:00
D. Berge
61ffd1b766 Refactor the function producing Seis+JSON into its own file.
For reuse.
2021-05-15 01:57:45 +02:00
D. Berge
d9f4583224 Implement GET middleware for events.
Produces a choice of outputs: JSON, GeoJSON, Seis+JSON and HTML.
2021-05-15 01:57:45 +02:00
D. Berge
95647337aa Add Nunjucks renderer.
The render function takes a JSON file and a Nunjucks
template and outputs a rendered version of the JSON
data according to the template.
2021-05-15 01:57:45 +02:00
D. Berge
b1e152179e Add new command: insert_event.py
Used to insert a timed event in the log.
2021-05-15 01:56:49 +02:00
D. Berge
142a820ed7 Process comment markers server-side.
Replace @POS@, @POSITION@ and @DMS@ in the remarks
with the event's position (sexagesimal degrees for
the last one).
2021-05-15 01:54:07 +02:00
D. Berge
838b45ef26 Do not fail if some data files are missing 2021-05-15 01:51:55 +02:00
D. Berge
30914b267a Set the right Content-Type for error outputs 2021-05-13 21:48:46 +02:00
D. Berge
f1cbbdb56b Check if raw P1/11 has records.
Even if it hasn't, the file gets imported anyway
(into the `files` table) but we exit early to avoid
an error when trying to determine shooting direction.

This check is not necessary for final P1/11 or gun data.

Fixes #104.
2021-05-12 20:35:51 +02:00
D. Berge
9973e8f132 Merge branch '94-let-users-assign-a-colour-to-preplot-lines' into 'devel'
Resolve "Let users assign a colour to preplot lines"

Closes #94

See merge request wgp/dougal/software!6
2021-05-09 22:41:51 +00:00
D. Berge
f53c479262 Add option to assign colours to preplot lines.
A ‘Set colour…’ option is available from the context menu;
it presents a dialogue allowing the user to choose a colour
that will be assigned to that preplot line and used as the
background colour for the corresponding row on the table
(may also be used for other things).

Because there is a good chance that the user may decide to
colour a large number of lines and it is cumbersome to do
it one at a time, a multiple selection option has also been
added. The context menu then shows options which will apply
to all selected rows. At this time only the change colour
option is available, but it can be extended easily.
2021-05-10 00:22:57 +02:00
D. Berge
73a415a038 Return preplot metadata via the API 2021-05-10 00:21:56 +02:00
D. Berge
0b24e3224f Let calendar toolbar follow theme.
Fixes #80.
2021-05-09 21:23:34 +02:00
D. Berge
c271256015 Remember map view settings.
Save the layer and overlay selection + map zoom and
position per user per project in the browser's
localStorage.

Closes #96.
2021-05-09 15:29:17 +02:00
D. Berge
4887ddaa26 Do not update page location on map change.
Fixes #77.
See also #96.
2021-05-09 03:52:25 +02:00
D. Berge
788c582f98 Show planned sequences in status field of preplots table.
Closes #86.
2021-05-09 00:20:02 +02:00
D. Berge
df9f7f33cf Retrieve user data for LineList.
Fixes a bug in commit fd2e0399f8.
2021-05-09 00:16:08 +02:00
D. Berge
fd2e0399f8 Remember last applied number of table rows.
A hopefully sensible default is applied, but if the
user changes it, the last selected value is saved
in the browsers localStorage.

Preferences are saved per user, project and table. And
per browser, of course, as those are only saved locally.

Closes #41.
2021-05-08 21:54:55 +02:00
D. Berge
db733ceef8 Add key to feed items 2021-05-08 21:54:06 +02:00
D. Berge
f905eb3fdf Upgrade Vuetify to latest version 2021-05-08 21:53:31 +02:00
D. Berge
e707887702 Change colour of planned lines on map.
As magenta is already used for the real-time track.
2021-05-08 20:36:19 +02:00
D. Berge
c0ace1fe07 Make check mark green if non-leaf QC item has no children.
If a test passes for all items, show the (single) check mark
and colour it green.

Leaf nodes always have their check mark in the default colour.

Related to #90.
2021-05-08 04:08:37 +02:00
D. Berge
7bb3a3910b Show development activity log.
A button in the help dialogue takes the user to the
/feed/… frontend URL, where the latest development
activity is shown, taken from the GitLab RSS feed
for the project.
2021-05-08 00:46:31 +02:00
D. Berge
983113b6cc Add flag to api action to fetch non-JSON data.
If {text:true} or another truthy value is passed as the
`text` option, the api action will use Response.text()
instead of Response.json().
2021-05-08 00:44:05 +02:00
D. Berge
ff66c9a88d Handle planner sequence value for first line in prospect.
The next sequence to shoot is normally retrieved from the
database via getSequence(), but it returns false if no
sequences have been shot yet.

In that case we use a default value of `1` to build the
name of the planned line.

Fixes #81
Fixes #82
2021-05-08 00:20:15 +02:00
D. Berge
56d30d48c5 Adapt help dialogue to small viewports 2021-05-07 23:52:36 +02:00
D. Berge
df3a0b4c50 Be explicit about what type of data is being QC'ed.
The source deviation QCs now tell the user whether raw
or final data is being QC'ed.
2021-05-07 21:29:39 +02:00
D. Berge
f87aa08246 Check if gun data missing for entire line.
The `sequences` object now carries the attribute
`has_smsrc_data`, a boolean which is true iff
there is at least one `smsrc` record in the raw
shots metadata.

This is used by:

1. A new sequence-wise test which reports if gun
   data is missing for the entire sequence.

2. The individual `missing_gun_data` test which
   is inhibited if `has_smsrc_data` for the
   corresponding sequence is false.

Closes #93.
2021-05-07 14:04:48 +02:00
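In outline (Python sketch, using the attribute described above):

    def has_smsrc_data(shots):
        # True iff at least one raw shot carries an `smsrc` record.
        return any("smsrc" in (shot.get("meta") or {}) for shot in shots)

    def missing_gun_data_test(sequence, shots):
        if not sequence["has_smsrc_data"]:
            return None  # inhibited; the sequence-wise test reports instead
        return [s for s in shots if "smsrc" not in (s.get("meta") or {})]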
D. Berge
ea499a645b Update package dependencies 2021-05-07 14:04:12 +02:00
D. Berge
0fdb42c593 Do not import files that have just been modified.
We now check that a file is at least a few seconds old
before attempting to import it.

The actual minimum age can be configured in etc/config.yaml or
else it defaults to 10 seconds.

The idea is that this should give the OS enough time to fully
write the file before we import it.

The timestamp being looked at is the modification time.

Fixes #92.
2021-05-07 13:50:32 +02:00
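The check itself is tiny (Python sketch; the 10-second default is configurable as described):

    import os
    import time

    def old_enough(path, min_age=10):
        # Compare against the modification time; skipping very recent
        # files gives the OS time to finish writing them.
        return time.time() - os.stat(path).st_mtime >= min_age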
D. Berge
6e5584a433 Make the QC double-tick green if all items accepted.
Closes #90.
2021-05-07 13:38:26 +02:00
D. Berge
0a4df0793d Update package dependencies 2021-05-07 13:37:44 +02:00
D. Berge
1e6cc67b05 Merge branch '63-serve-api-specification' into 'devel'
Resolve "Serve API specification"

Closes #63

See merge request wgp/dougal/software!5
2020-12-30 08:45:55 +00:00
D. Berge
3c4a558e02 Serve OpenAPI document on API root.
When a client makes a request for `/` (the root of
the API), the OpenAPI description is served in an
appropriate format according to the `Accept` request
header, as follows:

Accept: text/html => HTML version
Accept: application/json => JSON version
Accept: * => YAML version
2020-12-29 16:20:57 +01:00
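The negotiation rule, illustratively (Python sketch, not the actual server code):

    def openapi_body(accept, html_doc, json_doc, yaml_doc):
        # Most specific matches first; YAML is the catch-all.
        if "text/html" in accept:
            return html_doc, "text/html"
        if "application/json" in accept:
            return json_doc, "application/json"
        return yaml_doc, "text/yaml"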
D. Berge
76001cffe1 Create HTML version of OpenAPI doc on install.
When running `npm install`, a self-contained HTML document
with the contents of the OpenAPI specification is saved as
openapi.html in the same directory as openapi.yaml.
2020-12-29 16:18:53 +01:00
D. Berge
45a9c5aa07 Document login and logout endpoints 2020-10-23 17:28:41 +02:00
D. Berge
f926184471 Add label descriptions to API spec 2020-10-23 15:14:52 +02:00
D. Berge
5ffd3712cf Merge branch '61-user-authentication' into devel 2020-10-23 15:14:09 +02:00
D. Berge
80451796e1 Convert expiry time to milliseconds for set-cookie 2020-10-23 14:59:45 +02:00
D. Berge
141d5805ae Reissue user login tokens when close to expiring 2020-10-23 14:50:35 +02:00
D. Berge
250ffe243d Fix JWT token time to live.
Now half an hour.
2020-10-23 14:49:52 +02:00
D. Berge
b4decd018a Add API documentation 2020-10-23 11:09:08 +02:00
D. Berge
46d489c91f Fix metadata retrieval from preplots 2020-10-23 11:01:38 +02:00
D. Berge
8a0bcc5cb4 Change HTTP response status from 201 to 204 2020-10-23 11:00:56 +02:00
D. Berge
77258b12e9 Merge branch '62-service-desk-from-ss-om-magseisfairfield-com-bug-report' into 'devel'
Resolve "Service Desk (from ss.om@magseisfairfield.com): Bug report"

Closes #62

See merge request wgp/dougal/software!4
2020-10-15 17:20:30 +00:00
D. Berge
59aaacbeee Apply access restrictions to writable routes 2020-10-12 19:43:07 +02:00
D. Berge
3c86981dc6 Add authorisation middleware.
Defines three levels of access:
* read: anyone who is logged in
* write: `user` and `admin` roles
* admin: `admin` roles
2020-10-12 19:42:02 +02:00
D. Berge
5594b6863c Do not run authentication if headers already sent 2020-10-12 19:41:00 +02:00
D. Berge
7201c29df5 Inject auth middleware after login routes.
Routes not requiring authentication must,
self-evidently, go before the authentication
middleware.
2020-10-11 22:11:36 +02:00
D. Berge
947736e8c1 Check code rather than errno.
Different versions of that library work
differently.
2020-10-11 22:10:21 +02:00
D. Berge
d782a30e90 Avoid decoding empty cookies 2020-10-11 19:59:28 +02:00
D. Berge
987dbb7700 Handle null/invalid cookies 2020-10-11 19:36:11 +02:00
D. Berge
cdd007ce88 Fix authentication middleware 2020-10-11 19:08:36 +02:00
D. Berge
a38066ec82 Set cookie / user to null if failing to decode JWT 2020-10-11 19:06:57 +02:00
D. Berge
2aca34e488 Read user login info from discrete file.
`$DOUGAL_ROOT/etc/users.yaml` to be exact.
2020-10-11 18:21:19 +02:00
D. Berge
324306a77d Remove logging statement 2020-10-11 18:20:41 +02:00
D. Berge
ab8a66bdcf Set JWT default options 2020-10-11 17:58:41 +02:00
D. Berge
b3f393a6f1 Make navigation bar user control functional.
Shows whether the user is logged in and presents
appropriate options according to whether this is
a manual or automatic login (a manual login is
when the user explicitly logs in with a user name
and password).
2020-10-11 17:57:00 +02:00
D. Berge
1ee886db63 Add login/logout views to frontend 2020-10-11 17:56:32 +02:00
D. Berge
fc9450434c Read credentials from cookie store when loading app 2020-10-11 17:55:17 +02:00
D. Berge
00f4fcf292 Read credentials from API responses 2020-10-11 17:54:34 +02:00
D. Berge
0512ac2c3c Add user module to Vuex store 2020-10-11 17:53:39 +02:00
D. Berge
dd32982cbe Add login/logout middleware 2020-10-11 17:52:13 +02:00
D. Berge
a3bfb73937 Add authentication middleware.
The user is authenticated by one of the following
methods, in order of priority:

* The presence of a valid JWT.
* Its IP.
* Its hostname.

In the case of the latter two methods, if authentication
is successful a JWT valid for 15 minutes will be generated
and passed back to the user in a cookie.
2020-10-11 13:11:43 +02:00
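Schematically (Python sketch; all helper names hypothetical):

    def authenticate(request):
        # Methods tried in order of priority, as listed above.
        user = verify_jwt(request.cookies.get("token"))
        if user is None:
            user = user_for_ip(request.ip) or user_for_hostname(request.hostname)
            if user is not None:
                # Successful IP / hostname auth earns a short-lived JWT cookie.
                set_cookie(request, issue_jwt(user, ttl_minutes=15))
        return user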
399 changed files with 56033 additions and 6702 deletions

3
.gitignore vendored

@@ -10,3 +10,6 @@ lib/www/client/source/dist/
lib/www/client/dist/
etc/surveys/*.yaml
!etc/surveys/_*.yaml
etc/ssl/*
etc/config.yaml
var/*

27
bin/check_mounts_present.py Executable file

@@ -0,0 +1,27 @@
#!/usr/bin/python3
"""
Check if any of the directories provided in the imports.mounts configuration
section are empty.
Returns 0 if all arguments are non-empty, 1 otherwise. It stops at the first
empty directory.
"""
import os
import configuration
cfg = configuration.read()
if cfg and "imports" in cfg and "mounts" in cfg["imports"]:
mounts = cfg["imports"]["mounts"]
for item in mounts:
with os.scandir(item) as contents:
if not any(contents):
exit(1)
else:
print("No mounts in configuration")
exit(0)

View File

@@ -1,4 +1,5 @@
import os
import pathlib
from glob import glob
from yaml import full_load as _load
@@ -11,6 +12,18 @@ surveys should be under $HOME/etc/surveys/*.yaml. In both cases,
$HOME is the home directory of the user running this script.
"""
def is_relative_to(it, other):
    """
    is_relative_to() is not present in Python versions before 3.9,
    so we need this kludge to get Dougal to run on OpenSUSE 15.4
    """
    if "is_relative_to" in dir(it):
        return it.is_relative_to(other)
    return str(it.absolute()).startswith(str(other.absolute()))
prefix = os.environ.get("DOUGAL_ROOT", os.environ.get("HOME", ".")+"/software")
DOUGAL_ROOT = os.environ.get("DOUGAL_ROOT", os.environ.get("HOME", ".")+"/software")
@@ -54,6 +67,10 @@ def files (globspec = None, include_archived = False):
    quickly and temporarily “disabling” a survey configuration by renaming
    the relevant file.
    """
    print("This method is obsolete")
    return

    tuples = []
    if globspec is None:
@@ -87,3 +104,73 @@ def rxflags (flagstr):
    for flag in flagstr:
        flags |= cases.get(flag, 0)
    return flags
def translate_path (file):
    """
    Translate a path from a Dougal import directory to an actual
    physical path on disk.
    Any user files accessible by Dougal must be under a path prefixed
    by `(config.yaml).imports.paths`. The value of `imports.paths` may
    be either a string, in which case this represents the prefix under
    which all Dougal data resides, or a dictionary where the keys are
    logical paths and their values the corresponding physical path.
    """
    cfg = read()
    root = pathlib.Path(DOUGAL_ROOT)
    filepath = pathlib.Path(file).resolve()
    import_paths = cfg["imports"]["paths"]
    if filepath.is_absolute():
        if type(import_paths) == str:
            # Substitute the root for the real physical path
            # NOTE: `root` deals with import_paths not being absolute
            prefix = root.joinpath(pathlib.Path(import_paths)).resolve()
            return str(pathlib.Path(prefix).joinpath(*filepath.parts[2:]))
        else:
            # Look for a match on the second path element
            if filepath.parts[1] in import_paths:
                # NOTE: `root` deals with import_paths[…] not being absolute
                prefix = root.joinpath(import_paths[filepath.parts[1]])
                return str(pathlib.Path(prefix).joinpath(*filepath.parts[2:]))
            else:
                # This path is invalid
                raise TypeError("invalid path or file: {0!r}".format(filepath))
    else:
        # A relative filepath is always resolved relative to the logical root
        root = pathlib.Path("/")
        return translate_path(root.joinpath(filepath))
def untranslate_path (file):
    """
    Attempt to convert a physical path into a logical one.
    See `translate_path()` above for details.
    """
    cfg = read()
    dougal_root = pathlib.Path(DOUGAL_ROOT)
    filepath = pathlib.Path(file).resolve()
    import_paths = cfg["imports"]["paths"]
    physical_root = pathlib.Path("/")
    if filepath.is_absolute():
        if type(import_paths) == str:
            if is_relative_to(filepath, import_paths):
                physical_prefix = pathlib.Path(import_paths)
                return str(physical_root.joinpath(filepath.relative_to(physical_prefix)))
            else:
                raise TypeError("invalid path or file: {0!r}".format(filepath))
        else:
            for key, value in import_paths.items():
                value = dougal_root.joinpath(value)
                physical_prefix = pathlib.Path(value)
                if is_relative_to(filepath, physical_prefix):
                    logical_prefix = physical_root.joinpath(pathlib.Path(key)).resolve()
                    return str(logical_prefix.joinpath(filepath.relative_to(physical_prefix)))
            # If we got here with no matches, this is not a valid
            # Dougal data path
            raise TypeError("invalid path or file: {0!r}".format(filepath))
    else:
        # A relative filepath is always resolved relative to DOUGAL_ROOT
        return untranslate_path(dougal_root.joinpath(filepath))

View File

@@ -10,7 +10,7 @@
# be known to the database.
# * PROJECT_NAME is a more descriptive name for human consumption.
# * EPSG_CODE is the EPSG code identifying the CRS for the grid data in the
# navigation files, e.g., 32031.
# navigation files, e.g., 23031.
#
# In addition to this, certain other parameters may be controlled via
# environment variables:

26
bin/daily_tasks.py Executable file

@@ -0,0 +1,26 @@
#!/usr/bin/python3
"""
Do daily housekeeping on the database.
This is meant to run shortly after midnight every day.
"""
import configuration
from datastore import Datastore
if __name__ == '__main__':
    print("Connecting to database")
    db = Datastore()
    surveys = db.surveys()
    print("Reading surveys")
    for survey in surveys:
        print(f'Survey: {survey["id"]} ({survey["schema"]})')
        db.set_survey(survey["schema"])
        print("Daily tasks")
        db.run_daily_tasks()
    print("Done")

View File

@@ -4,6 +4,7 @@ import psycopg2
import configuration
import preplots
import p111
from hashlib import md5 # Because it's good enough
"""
Interface to the PostgreSQL database.
@@ -11,13 +12,16 @@ Interface to the PostgreSQL database.
def file_hash(file):
    """
    Calculate a file hash based on its size, inode, modification and creation times.
    Calculate a file hash based on its name, size, modification and creation times.
    The hash is used to uniquely identify files in the database and detect if they
    have changed.
    """
    h = md5()
    h.update(file.encode())
    name_digest = h.hexdigest()[:16]
    st = os.stat(file)
    return ":".join([str(v) for v in [st.st_size, st.st_mtime, st.st_ctime, st.st_ino]])
    return ":".join([str(v) for v in [st.st_size, st.st_mtime, st.st_ctime, name_digest]])
class Datastore:
    """
@@ -48,7 +52,7 @@ class Datastore:
        self.conn = psycopg2.connect(configuration.read()["db"]["connection_string"], **opts)

    def set_autocommit(value = True):
    def set_autocommit(self, value = True):
        """
        Enable or disable autocommit.
@@ -91,7 +95,7 @@ class Datastore:
            cursor.execute(qry, (filepath,))
            results = cursor.fetchall()
            if len(results):
                return (filepath, file_hash(filepath)) in results
                return (filepath, file_hash(configuration.translate_path(filepath))) in results

    def add_file(self, path, cursor = None):
@@ -103,7 +107,8 @@ class Datastore:
        else:
            cur = cursor

        hash = file_hash(path)
        realpath = configuration.translate_path(path)
        hash = file_hash(realpath)
        qry = "CALL add_file(%s, %s);"
        cur.execute(qry, (path, hash))
        if cursor is None:
@@ -172,7 +177,7 @@ class Datastore:
        else:
            cur = cursor

        hash = file_hash(path)
        hash = file_hash(configuration.translate_path(path))
        qry = """
            UPDATE raw_lines rl
            SET ntbp = %s
@@ -390,20 +395,40 @@ class Datastore:
        with self.conn.cursor() as cursor:
            cursor.execute("BEGIN;")
            hash = self.add_file(filepath, cursor)

            if not records or len(records) == 0:
                print("File has no records (or none have been detected)")
                # We add the file to the database anyway to signal that we have
                # actually seen it.
                self.maybe_commit()
                return

            incr = p111.point_number(records[0]) <= p111.point_number(records[-1])

            # Start by deleting any online data we may have for this sequence
            self.del_hash("*online*", cursor)

            qry = """
                INSERT INTO raw_lines (sequence, line, remarks, ntbp, incr)
                VALUES (%s, %s, '', %s, %s)
                ON CONFLICT DO NOTHING;
                INSERT INTO raw_lines (sequence, line, remarks, ntbp, incr, meta)
                VALUES (%s, %s, '', %s, %s, %s)
                ON CONFLICT (sequence) DO UPDATE SET
                    line = EXCLUDED.line,
                    ntbp = EXCLUDED.ntbp,
                    incr = EXCLUDED.incr,
                    meta = EXCLUDED.meta;
            """
            cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"], ntbp, incr))
            cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"], ntbp, incr, json.dumps(fileinfo["meta"])))

            qry = """
                UPDATE raw_lines
                SET meta = meta || %s
                WHERE sequence = %s;
            """
            cursor.execute(qry, (json.dumps(fileinfo["meta"]), fileinfo["sequence"]))

            qry = """
                INSERT INTO raw_lines_files (sequence, hash)
@@ -436,16 +461,26 @@ class Datastore:
        with self.conn.cursor() as cursor:
            cursor.execute("BEGIN;")
            hash = self.add_file(filepath, cursor)

            qry = """
                INSERT INTO final_lines (sequence, line, remarks)
                VALUES (%s, %s, '')
                ON CONFLICT DO NOTHING;
                INSERT INTO final_lines (sequence, line, remarks, meta)
                VALUES (%s, %s, '', %s)
                ON CONFLICT (sequence) DO UPDATE SET
                    line = EXCLUDED.line,
                    meta = EXCLUDED.meta;
            """
            cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"]))
            cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"], json.dumps(fileinfo["meta"])))

            qry = """
                UPDATE raw_lines
                SET meta = meta || %s
                WHERE sequence = %s;
            """
            cursor.execute(qry, (json.dumps(fileinfo["meta"]), fileinfo["sequence"]))

            qry = """
                INSERT INTO final_lines_files (sequence, hash)
@@ -472,6 +507,8 @@ class Datastore:
            if filedata is not None:
                self.save_file_data(filepath, json.dumps(filedata), cursor)

            cursor.execute("CALL final_line_post_import(%s);", (fileinfo["sequence"],))

        self.maybe_commit()

    def save_raw_smsrc (self, records, fileinfo, filepath, filedata = None):
@@ -506,7 +543,7 @@ class Datastore:
qry = """
UPDATE raw_shots
SET meta = jsonb_set(meta, '{smsrc}', %s::jsonb, true)
SET meta = jsonb_set(meta, '{smsrc}', %s::jsonb, true) - 'qc'
WHERE sequence = %s AND point = %s;
"""
@@ -552,7 +589,63 @@ class Datastore:
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def get_file_data(self, path, cursor = None):
"""
Retrieve arbitrary data associated with a file.
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
realpath = configuration.translate_path(path)
hash = file_hash(realpath)
qry = """
SELECT data
FROM file_data
WHERE hash = %s;
"""
cur.execute(qry, (hash,))
res = cur.fetchone()
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
return res[0] if res else None
def surveys (self, include_archived = False):
"""
Return list of survey definitions.
"""
if self.conn is None:
self.connect()
if include_archived:
qry = """
SELECT meta
FROM public.projects;
"""
else:
qry = """
SELECT meta
FROM public.projects
WHERE NOT (meta->'archived')::boolean IS true
"""
with self.conn:
with self.conn.cursor() as cursor:
cursor.execute(qry)
results = cursor.fetchall()
return [r[0] for r in results if r[0]]
# TODO Does this need tweaking on account of #246?
def apply_survey_configuration(self, cursor = None):
if cursor is None:
cur = self.conn.cursor()
@@ -631,3 +724,73 @@ class Datastore:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def del_sequence_final(self, sequence, cursor = None):
"""
Remove final data for a sequence.
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
qry = "DELETE FROM files WHERE hash = (SELECT hash FROM final_lines_files WHERE sequence = %s);"
cur.execute(qry, (sequence,))
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def adjust_planner(self, cursor = None):
"""
Adjust estimated times on the planner
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
qry = "CALL adjust_planner();"
cur.execute(qry)
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def housekeep_event_log(self, cursor = None):
"""
Call housekeeping actions on the event log
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
qry = "CALL augment_event_data();"
cur.execute(qry)
qry = "CALL scan_placeholders();"
cur.execute(qry)
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def run_daily_tasks(self, cursor = None):
"""
Run once-a-day tasks
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
qry = "CALL log_midnight_shots();"
cur.execute(qry)
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
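The recurring remark about not committing when handed a cursor describes the convention that lets callers batch several `Datastore` methods into a single transaction. A minimal usage sketch, assuming only the signatures shown above:

```python
# Sketch only: two housekeeping calls share one cursor, so neither
# commits on its own; the caller issues a single commit at the end.
db = Datastore()
db.connect()
with db.conn.cursor() as cursor:
    cursor.execute("BEGIN;")
    db.adjust_planner(cursor)        # no commit inside
    db.housekeep_event_log(cursor)   # no commit inside
    db.maybe_commit()                # one commit covering both
```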

bin/housekeep_database.py Executable file

@@ -0,0 +1,26 @@
#!/usr/bin/python3
"""
Do housekeeping actions on the database.
"""
import configuration
from datastore import Datastore
if __name__ == '__main__':
print("Connecting to database")
db = Datastore()
surveys = db.surveys()
print("Reading surveys")
for survey in surveys:
print(f'Survey: {survey["id"]} ({survey["schema"]})')
db.set_survey(survey["schema"])
print("Planner adjustment")
db.adjust_planner()
print("Event log housekeeping")
db.housekeep_event_log()
print("Done")


@@ -59,7 +59,7 @@ def qc_data (cursor, prefix):
else:
print("No QC data found");
return
#print("QC", qc)
index = 0
for item in qc["results"]:


@@ -39,7 +39,7 @@ def seis_data (survey):
if not pathlib.Path(pathPrefix).exists():
print(pathPrefix)
raise ValueError("Export path does not exist")
print(f"Requesting sequences for {survey['id']}")
url = f"http://localhost:3000/api/project/{survey['id']}/sequence"
r = requests.get(url)
@@ -47,12 +47,12 @@ def seis_data (survey):
for sequence in r.json():
if sequence['status'] not in ["final", "ntbp"]:
continue
filename = pathlib.Path(pathPrefix, "sequence{:0>3d}.json".format(sequence['sequence']))
if filename.exists():
print(f"Skipping export for sequence {sequence['sequence']} file already exists")
continue
print(f"Processing sequence {sequence['sequence']}")
url = f"http://localhost:3000/api/project/{survey['id']}/event?sequence={sequence['sequence']}&missing=t"
headers = { "Accept": "application/vnd.seis+json" }


@@ -12,18 +12,50 @@ import os
import sys
import pathlib
import re
import time
import configuration
import p111
from datastore import Datastore
def add_pending_remark(db, sequence):
text = '<!-- @@DGL:PENDING@@ --><h4 style="color:red;cursor:help;" title="Edit the sequence file or directory name to import final data">Marked as <code>PENDING</code>.</h4><!-- @@/DGL:PENDING@@ -->\n'
with db.conn.cursor() as cursor:
qry = "SELECT remarks FROM raw_lines WHERE sequence = %s;"
cursor.execute(qry, (sequence,))
remarks = cursor.fetchone()[0]
rx = re.compile("^(<!-- @@DGL:PENDING@@ -->.*<!-- @@/DGL:PENDING@@ -->\n)")
m = rx.match(remarks)
if m is None:
remarks = text + remarks
qry = "UPDATE raw_lines SET remarks = %s WHERE sequence = %s;"
cursor.execute(qry, (remarks, sequence))
db.maybe_commit()
def del_pending_remark(db, sequence):
with db.conn.cursor() as cursor:
qry = "SELECT remarks FROM raw_lines WHERE sequence = %s;"
cursor.execute(qry, (sequence,))
row = cursor.fetchone()
if row is not None:
remarks = row[0]
rx = re.compile("^(<!-- @@DGL:PENDING@@ -->.*<!-- @@/DGL:PENDING@@ -->\n)")
m = rx.match(remarks)
if m is not None:
remarks = rx.sub("",remarks)
qry = "UPDATE raw_lines SET remarks = %s WHERE sequence = %s;"
cursor.execute(qry, (remarks, sequence))
db.maybe_commit()
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Connecting to database")
db = Datastore()
db.connect()
surveys = db.surveys()
print("Reading surveys")
for survey in surveys:
@@ -40,35 +72,64 @@ if __name__ == '__main__':
pattern = final_p111["pattern"]
rx = re.compile(pattern["regex"])
if "pending" in survey["final"]:
pendingRx = re.compile(survey["final"]["pending"]["pattern"]["regex"])
for fileprefix in final_p111["paths"]:
print(f"Path prefix: {fileprefix}")
realprefix = configuration.translate_path(fileprefix)
print(f"Path prefix: {fileprefix}{realprefix}")
for globspec in final_p111["globs"]:
for filepath in pathlib.Path(fileprefix).glob(globspec):
filepath = str(filepath)
print(f"Found {filepath}")
for physical_filepath in pathlib.Path(realprefix).glob(globspec):
physical_filepath = str(physical_filepath)
logical_filepath = configuration.untranslate_path(physical_filepath)
print(f"Found {logical_filepath}")
pending = False
if pendingRx:
pending = pendingRx.search(physical_filepath) is not None
if not db.file_in_db(logical_filepath):
age = time.time() - os.path.getmtime(physical_filepath)
if age < file_min_age:
print("Skipping file because too new", logical_filepath)
continue
if not db.file_in_db(filepath):
print("Importing")
match = rx.match(os.path.basename(filepath))
match = rx.match(os.path.basename(logical_filepath))
if not match:
error_message = f"File path not match the expected format! ({filepath} ~ {pattern['regex']})"
error_message = f"File path not match the expected format! ({logical_filepath} ~ {pattern['regex']})"
print(error_message, file=sys.stderr)
print("This file will be ignored!")
continue
file_info = dict(zip(pattern["captures"], match.groups()))
file_info["meta"] = {}
p111_data = p111.from_file(filepath)
if pending:
print("Skipping / removing final file because marked as PENDING", logical_filepath)
db.del_sequence_final(file_info["sequence"])
add_pending_remark(db, file_info["sequence"])
continue
else:
del_pending_remark(db, file_info["sequence"])
p111_data = p111.from_file(physical_filepath)
print("Saving")
p111_records = p111.p111_type("S", p111_data)
file_info["meta"]["lineName"] = p111.line_name(p111_data)
db.save_final_p111(p111_records, file_info, filepath, survey["epsg"])
db.save_final_p111(p111_records, file_info, logical_filepath, survey["epsg"])
else:
print("Already in DB")
if pending:
print("Removing from database because marked as PENDING")
db.del_sequence_final(file_info["sequence"])
add_pending_remark(db, file_info["sequence"])
print("Done")


@@ -12,6 +12,7 @@ import os
import sys
import pathlib
import re
import time
import configuration
import p190
from datastore import Datastore
@@ -20,6 +21,7 @@ if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Connecting to database")
db = Datastore()
@@ -49,6 +51,12 @@ if __name__ == '__main__':
print(f"Found {filepath}")
if not db.file_in_db(filepath):
age = time.time() - os.path.getmtime(filepath)
if age < file_min_age:
print("Skipping file because too new", filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))

bin/import_map_layers.py Executable file

@@ -0,0 +1,127 @@
#!/usr/bin/python3
"""
Import SmartSource data.
For each survey in configuration.surveys(), check for new
or modified final gun header files and (re-)import them into the
database.
"""
import os
import sys
import pathlib
import re
import time
import json
import configuration
from datastore import Datastore
if __name__ == '__main__':
"""
Imports map layers from the locations defined in the configuration object
`imports.map.layers`. The content of that key is an object with the following
structure:
{
layer1Name: [
{
format: "geojson",
path: "", // Logical path to a directory or a single file
globs: [
"**/*.geojson", // List of globs matching map data files
]
},
…
],
layer2Name: …
}
"""
def process (layer_name, layer, physical_filepath):
physical_filepath = str(physical_filepath)
logical_filepath = configuration.untranslate_path(physical_filepath)
print(f"Found {logical_filepath}")
if not db.file_in_db(logical_filepath):
age = time.time() - os.path.getmtime(physical_filepath)
if age < file_min_age:
print("Skipping file because too new", logical_filepath)
return
print("Importing")
file_info = {
"type": "map_layer",
"format": layer["format"],
"name": layer_name,
"tooltip": layer.get("tooltip"),
"popup": layer.get("popup")
}
db.save_file_data(logical_filepath, json.dumps(file_info))
else:
file_info = db.get_file_data(logical_filepath)
dirty = False
if file_info:
if file_info["name"] != layer_name:
print("Renaming to", layer_name)
file_info["name"] = layer_name
dirty = True
if file_info.get("tooltip") != layer.get("tooltip"):
print("Changing tooltip to", layer.get("tooltip") or "null")
file_info["tooltip"] = layer.get("tooltip")
dirty = True
if file_info.get("popup") != layer.get("popup"):
print("Changing popup to", layer.get("popup") or "null")
file_info["popup"] = layer.get("popup")
dirty = True
if dirty:
db.save_file_data(logical_filepath, json.dumps(file_info))
else:
print("Already in DB")
print("Reading configuration")
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Connecting to database")
db = Datastore()
surveys = db.surveys()
print("Reading surveys")
for survey in surveys:
print(f'Survey: {survey["id"]} ({survey["schema"]})')
db.set_survey(survey["schema"])
try:
map_layers = survey["imports"]["map"]["layers"]
except KeyError:
print("No map layers defined")
continue
for layer_name, layer_items in map_layers.items():
for layer in layer_items:
fileprefix = layer["path"]
realprefix = configuration.translate_path(fileprefix)
if os.path.isfile(realprefix):
process(layer_name, layer, realprefix)
elif os.path.isdir(realprefix):
if not "globs" in layer:
layer["globs"] = [ "**/*.geojson" ]
for globspec in layer["globs"]:
for physical_filepath in pathlib.Path(realprefix).glob(globspec):
process(layer_name, layer, physical_filepath)
print("Done")


@@ -8,29 +8,40 @@ or modified preplots and (re-)import them into the database.
"""
from glob import glob
import os
import sys
import time
import configuration
import preplots
from datastore import Datastore
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
print("Connecting to database")
db = Datastore()
surveys = db.surveys()
print("Reading configuration")
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Reading surveys")
for survey in surveys:
print(f'Survey: {survey["id"]} ({survey["schema"]})')
db.set_survey(survey["schema"])
for file in survey["preplots"]:
realpath = configuration.translate_path(file["path"])
print(f"Preplot: {file['path']}")
if not db.file_in_db(file["path"]):
age = time.time() - os.path.getmtime(realpath)
if age < file_min_age:
print("Skipping file because too new", file["path"])
continue
print("Importing")
try:
preplot = preplots.from_file(file)
preplot = preplots.from_file(file, realpath)
except FileNotFoundError:
print(f"File does not exist: {file['path']}", file=sys.stderr)
continue


@@ -12,6 +12,7 @@ import os
import sys
import pathlib
import re
import time
import configuration
import p111
from datastore import Datastore
@@ -19,11 +20,11 @@ from datastore import Datastore
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Connecting to database")
db = Datastore()
db.connect()
surveys = db.surveys()
print("Reading surveys")
for survey in surveys:
@@ -44,43 +45,56 @@ if __name__ == '__main__':
ntbpRx = re.compile(survey["raw"]["ntbp"]["pattern"]["regex"])
for fileprefix in raw_p111["paths"]:
print(f"Path prefix: {fileprefix}")
realprefix = configuration.translate_path(fileprefix)
print(f"Path prefix: {fileprefix}{realprefix}")
for globspec in raw_p111["globs"]:
for filepath in pathlib.Path(fileprefix).glob(globspec):
filepath = str(filepath)
print(f"Found {filepath}")
for physical_filepath in pathlib.Path(realprefix).glob(globspec):
physical_filepath = str(physical_filepath)
logical_filepath = configuration.untranslate_path(physical_filepath)
print(f"Found {logical_filepath}")
if ntbpRx:
ntbp = ntbpRx.search(filepath) is not None
ntbp = ntbpRx.search(physical_filepath) is not None
else:
ntbp = False
if not db.file_in_db(filepath):
if not db.file_in_db(logical_filepath):
age = time.time() - os.path.getmtime(physical_filepath)
if age < file_min_age:
print("Skipping file because too new", logical_filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))
match = rx.match(os.path.basename(logical_filepath))
if not match:
error_message = f"File path not match the expected format! ({filepath} ~ {pattern['regex']})"
error_message = f"File path not matching the expected format! ({logical_filepath} ~ {pattern['regex']})"
print(error_message, file=sys.stderr)
print("This file will be ignored!")
continue
file_info = dict(zip(pattern["captures"], match.groups()))
file_info["meta"] = {}
p111_data = p111.from_file(filepath)
p111_data = p111.from_file(physical_filepath)
print("Saving")
p111_records = p111.p111_type("S", p111_data)
if len(p111_records):
file_info["meta"]["lineName"] = p111.line_name(p111_data)
db.save_raw_p111(p111_records, file_info, filepath, survey["epsg"], ntbp=ntbp)
db.save_raw_p111(p111_records, file_info, logical_filepath, survey["epsg"], ntbp=ntbp)
else:
print("No source records found in file")
else:
print("Already in DB")
# Update the NTBP status to whatever the latest is,
# as it might have changed.
db.set_ntbp(filepath, ntbp)
db.set_ntbp(logical_filepath, ntbp)
if ntbp:
print("Sequence is NTBP")


@@ -12,6 +12,7 @@ import os
import sys
import pathlib
import re
import time
import configuration
import p190
from datastore import Datastore
@@ -20,6 +21,7 @@ if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Connecting to database")
db = Datastore()
@@ -52,6 +54,12 @@ if __name__ == '__main__':
print(f"Found {filepath}")
if not db.file_in_db(filepath):
age = time.time() - os.path.getmtime(filepath)
if age < file_min_age:
print("Skipping file because too new", filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))


@@ -12,6 +12,7 @@ import os
import sys
import pathlib
import re
import time
import configuration
import smsrc
from datastore import Datastore
@@ -19,11 +20,11 @@ from datastore import Datastore
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Connecting to database")
db = Datastore()
db.connect()
surveys = db.surveys()
print("Reading surveys")
for survey in surveys:
@@ -45,30 +46,38 @@ if __name__ == '__main__':
rx = re.compile(pattern["regex"], flags)
for fileprefix in raw_smsrc["paths"]:
print(f"Path prefix: {fileprefix}")
realprefix = configuration.translate_path(fileprefix)
print(f"Path prefix: {fileprefix}{realprefix}")
for globspec in raw_smsrc["globs"]:
for filepath in pathlib.Path(fileprefix).glob(globspec):
filepath = str(filepath)
print(f"Found {filepath}")
for physical_filepath in pathlib.Path(realprefix).glob(globspec):
physical_filepath = str(physical_filepath)
logical_filepath = configuration.untranslate_path(physical_filepath)
print(f"Found {logical_filepath}")
if not db.file_in_db(logical_filepath):
age = time.time() - os.path.getmtime(physical_filepath)
if age < file_min_age:
print("Skipping file because too new", logical_filepath)
continue
if not db.file_in_db(filepath):
print("Importing")
match = rx.match(os.path.basename(filepath))
match = rx.match(os.path.basename(logical_filepath))
if not match:
error_message = f"File path not matching the expected format! ({filepath} ~ {pattern['regex']})"
error_message = f"File path not matching the expected format! ({logical_filepath} ~ {pattern['regex']})"
print(error_message, file=sys.stderr)
print("This file will be ignored!")
continue
file_info = dict(zip(pattern["captures"], match.groups()))
smsrc_records = smsrc.from_file(filepath)
smsrc_records = smsrc.from_file(physical_filepath)
print("Saving")
db.save_raw_smsrc(smsrc_records, file_info, filepath)
db.save_raw_smsrc(smsrc_records, file_info, logical_filepath)
else:
print("Already in DB")


@@ -15,25 +15,4 @@ from datastore import Datastore
if __name__ == '__main__':
print("Reading configuration")
configs = configuration.files(include_archived = True)
print("Connecting to database")
db = Datastore()
#db.connect()
print("Reading surveys")
for config in configs:
filepath = config[0]
survey = config[1]
print(f'Survey: {survey["id"]} ({filepath})')
db.set_survey(survey["schema"])
if not db.file_in_db(filepath):
print("Saving to DB")
db.save_file_data(filepath, json.dumps(survey))
print("Applying survey configuration")
db.apply_survey_configuration()
else:
print("Already in DB")
print("Done")
print("This function is obsolete. Returning with no action")

bin/insert_event.py Executable file

@@ -0,0 +1,48 @@
#!/usr/bin/python3
from datetime import datetime
from datastore import Datastore
def detect_schema (conn):
with conn.cursor() as cursor:
qry = "SELECT meta->>'_schema' AS schema, tstamp, age(current_timestamp, tstamp) age FROM real_time_inputs WHERE meta ? '_schema' AND age(current_timestamp, tstamp) < '02:00:00' ORDER BY tstamp DESC LIMIT 1"
cursor.execute(qry)
res = cursor.fetchone()
if res and len(res):
return res[0]
return None
if __name__ == '__main__':
import argparse
ap = argparse.ArgumentParser()
ap.add_argument("-s", "--schema", required=False, default=None, help="survey where to insert the event")
ap.add_argument("-t", "--tstamp", required=False, default=None, help="event timestamp")
ap.add_argument("-l", "--label", required=False, default=None, action="append", help="event label")
ap.add_argument('remarks', type=str, nargs="+", help="event message")
args = vars(ap.parse_args())
db = Datastore()
db.connect()
if args["schema"]:
schema = args["schema"]
else:
schema = detect_schema(db.conn)
if args["tstamp"]:
tstamp = args["tstamp"]
else:
tstamp = datetime.utcnow().isoformat()
message = " ".join(args["remarks"])
print("new event:", schema, tstamp, message, args["label"])
if schema and tstamp and message:
db.set_survey(schema)
with db.conn.cursor() as cursor:
qry = "INSERT INTO event_log (tstamp, remarks, labels) VALUES (%s, replace_placeholders(%s, %s, NULL, NULL), %s);"
cursor.execute(qry, (tstamp, message, tstamp, args["label"]))
db.maybe_commit()
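As a usage sketch, a hypothetical invocation might be `./bin/insert_event.py -l FIX "Streamer recovered"`; when `-s` and `-t` are omitted the script falls back to the schema detected from recent real-time inputs and to the current UTC time, and `-l` may be repeated to attach several labels.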


@@ -153,6 +153,9 @@ def parse_line (string):
return None
def line_name(records):
return set([ r['Acquisition Line Name'] for r in p111_type("S", records) ]).pop()
def p111_type(type, records):
return [ r for r in records if r["type"] == type ]
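A small usage sketch of the new helper (the input path and return value are illustrative):

```python
# line_name() collapses the Acquisition Line Name of all S records into
# a set and pops one element; it therefore assumes the file names a
# single line, and would return an arbitrary name from a mixed file.
records = p111.from_file(physical_filepath)
name = p111.line_name(records)  # e.g. "3031P11234" (made-up value)
```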


@@ -12,7 +12,7 @@ from parse_fwr import parse_fwr
def parse_p190_header (string):
"""Parse a generic P1/90 header record.
Returns a dictionary of fields.
"""
names = [ "record_type", "header_type", "header_type_modifier", "description", "data" ]
@@ -27,7 +27,7 @@ def parse_p190_type1 (string):
"doy", "time", "spare2" ]
record = parse_fwr(string, [1, 12, 3, 1, 1, 1, 6, 10, 11, 9, 9, 6, 3, 6, 1])
return dict(zip(names, record))
def parse_p190_rcv_group (string):
"""Parse a P1/90 Type 1 receiver group record."""
names = [ "record_type",
@@ -37,7 +37,7 @@ def parse_p190_rcv_group (string):
"streamer_id" ]
record = parse_fwr(string, [1, 4, 9, 9, 4, 4, 9, 9, 4, 4, 9, 9, 4, 1])
return dict(zip(names, record))
def parse_line (string):
type = string[0]
if string[:3] == "EOF":
@@ -52,7 +52,7 @@ def parse_line (string):
def p190_type(type, records):
return [ r for r in records if r["record_type"] == type ]
def p190_header(code, records):
return [ h for h in p190_type("H", records) if h["header_type"]+h["header_type_modifier"] == code ]
@@ -86,15 +86,15 @@ def normalise_record(record):
# These are probably strings
elif "strip" in dir(record[key]):
record[key] = record[key].strip()
return record
def normalise(records):
for record in records:
normalise_record(record)
return records
def from_file(path, only_records=None, shot_range=None, with_objrefs=False):
records = []
with open(path) as fd:
@@ -102,10 +102,10 @@ def from_file(path, only_records=None, shot_range=None, with_objrefs=False):
line = fd.readline()
while line:
cnt = cnt + 1
if line == "EOF":
break
record = parse_line(line)
if record is not None:
if only_records:
@@ -121,9 +121,9 @@ def from_file(path, only_records=None, shot_range=None, with_objrefs=False):
records.append(record)
line = fd.readline()
return records
def apply_tstamps(recordset, tstamp=None, fix_bad_seconds=False):
#print("tstamp", tstamp, type(tstamp))
if type(tstamp) is int:
@@ -161,16 +161,16 @@ def apply_tstamps(recordset, tstamp=None, fix_bad_seconds=False):
record["tstamp"] = ts
prev[object_id(record)] = doy
break
return recordset
def dms(value):
# 591544.61N
hemisphere = 1 if value[-1] in "NnEe" else -1
seconds = float(value[-6:-1])
minutes = int(value[-8:-6])
degrees = int(value[:-8])
return (degrees + minutes/60 + seconds/3600) * hemisphere
def tod(record):
@@ -183,7 +183,7 @@ def tod(record):
m = int(time[2:4])
s = float(time[4:])
return d*86400 + h*3600 + m*60 + s
def duration(record0, record1):
ts0 = tod(record0)
ts1 = tod(record1)
@@ -198,10 +198,10 @@ def azimuth(record0, record1):
x0, y0 = float(record0["easting"]), float(record0["northing"])
x1, y1 = float(record1["easting"]), float(record1["northing"])
return math.degrees(math.atan2(x1-x0, y1-y0)) % 360
def speed(record0, record1, knots=False):
scale = 3600/1852 if knots else 1
t0 = tod(record0)
t1 = tod(record1)
return (distance(record0, record1) / math.fabs(t1-t0)) * scale


@@ -4,9 +4,10 @@ import sps
Preplot importing functions.
"""
def from_file (file):
def from_file (file, realpath = None):
filepath = realpath or file["path"]
if not "type" in file or file["type"] == "sps":
records = sps.from_file(file["path"], file["format"] if "format" in file else None )
records = sps.from_file(filepath, file["format"] if "format" in file else None )
else:
return "Not an SPS file"


@@ -13,21 +13,27 @@ from datastore import Datastore
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
print("Connecting to database")
db = Datastore()
print("Reading configuration")
surveys = db.surveys()
print("Reading surveys")
for survey in surveys:
print(f'Survey: {survey["id"]} ({survey["schema"]})')
db.set_survey(survey["schema"])
for file in db.list_files():
path = file[0]
if not os.path.exists(path):
print(path, "NOT FOUND")
db.del_file(path)
try:
path = configuration.translate_path(file[0])
if not os.path.exists(path):
print(path, "NOT FOUND")
db.del_file(file[0])
except TypeError:
# In case the logical path no longer matches
# the Dougal configuration.
print(file[0], "COULD NOT BE TRANSLATED TO A PHYSICAL PATH. DELETING")
db.del_file(file[0])
print("Done")


@@ -1,5 +1,6 @@
#!/bin/bash
DOUGAL_ROOT=${DOUGAL_ROOT:-$(dirname "$0")/..}
BINDIR="$DOUGAL_ROOT/bin"
@@ -8,6 +9,20 @@ LOCKFILE=${LOCKFILE:-$VARDIR/runner.lock}
[ -f ~/.profile ] && . ~/.profile
DOUGAL_LOG_TAG="dougal.runner[$$]"
# Only send output to the logger if we have the appropriate
# configuration set.
if [[ -n "$DOUGAL_LOG_TAG" && -n "$DOUGAL_LOG_FACILITY" ]]; then
function _logger () {
logger "$@"
}
else
function _logger () {
: # This is the Bash null command
}
fi
function tstamp () {
date -u +%Y-%m-%dT%H:%M:%SZ
}
@@ -18,26 +33,44 @@ function prefix () {
function print_log () {
printf "$(prefix)\033[36m%s\033[0m\n" "$*"
_logger -t "$DOUGAL_LOG_TAG" -p "$DOUGAL_LOG_FACILITY.info" "$*"
}
function print_info () {
printf "$(prefix)\033[0m%s\n" "$*"
_logger -t "$DOUGAL_LOG_TAG" -p "$DOUGAL_LOG_FACILITY.debug" "$*"
}
function print_warning () {
printf "$(prefix)\033[33;1m%s\033[0m\n" "$*"
_logger -t "$DOUGAL_LOG_TAG" -p "$DOUGAL_LOG_FACILITY.warning" "$*"
}
function print_error () {
printf "$(prefix)\033[31m%s\033[0m\n" "$*"
_logger -t "$DOUGAL_LOG_TAG" -p "$DOUGAL_LOG_FACILITY.error" "$*"
}
function run () {
PROGNAME=$(basename "$1")
PROGNAME=${PROGNAME:-$(basename "$1")}
STDOUTLOG="$VARDIR/$PROGNAME.out"
STDERRLOG="$VARDIR/$PROGNAME.err"
"$1" >"$STDOUTLOG" 2>"$STDERRLOG" || {
# What follows runs the command that we have been given (with any arguments passed)
# and logs:
# * stdout to $STDOUTLOG (a temporary file) and possibly to syslog, if enabled.
# * stderr to $STDERRLOG (a temporary file) and possibly to syslog, if enabled.
#
# When logging to syslog, stdout goes as debug level and stderr as warning (not error)
#
# The temporary file is used in case the command fails, at which point we try to log
# a warning in GitLab's alerts facility.
"$@" \
> >(tee $STDOUTLOG |_logger -t "dougal.runner.$PROGNAME[$$]" -p "$DOUGAL_LOG_FACILITY.debug") \
2> >(tee $STDERRLOG |_logger -t "dougal.runner.$PROGNAME[$$]" -p "$DOUGAL_LOG_FACILITY.warning") || {
print_error "Failed: $PROGNAME"
cat $STDOUTLOG
cat $STDERRLOG
@@ -52,11 +85,17 @@ function run () {
exit 2
}
# cat $STDOUTLOG
unset PROGNAME
rm $STDOUTLOG $STDERRLOG
}
function cleanup () {
if [[ -f $LOCKFILE ]]; then
rm "$LOCKFILE"
fi
}
if [[ -f $LOCKFILE ]]; then
PID=$(cat "$LOCKFILE")
if pgrep -F "$LOCKFILE"; then
@@ -74,6 +113,13 @@ echo "$$" > "$LOCKFILE" || {
}
print_info "Start run"
print_log "Check if data is accessible"
$BINDIR/check_mounts_present.py || {
print_warning "Import mounts not accessible. Inhibiting all tasks!"
cleanup
exit 253
}
print_log "Purge deleted files"
run $BINDIR/purge_deleted_files.py
@@ -86,33 +132,47 @@ run $BINDIR/import_preplots.py
print_log "Import raw P1/11"
run $BINDIR/import_raw_p111.py
print_log "Import raw P1/90"
run $BINDIR/import_raw_p190.py
#print_log "Import raw P1/90"
#run $BINDIR/import_raw_p190.py
print_log "Import final P1/11"
run $BINDIR/import_final_p111.py
print_log "Import final P1/90"
run $BINDIR/import_final_p190.py
#print_log "Import final P1/90"
#run $BINDIR/import_final_p190.py
print_log "Import SmartSource data"
run $BINDIR/import_smsrc.py
if [[ -z "$RUNNER_NOEXPORT" ]]; then
print_log "Export system data"
run $BINDIR/system_exports.py
fi
print_log "Import map user layers"
run $BINDIR/import_map_layers.py
if [[ -n "$RUNNER_IMPORT" ]]; then
print_log "Import system data"
run $BINDIR/system_imports.py
fi
# if [[ -z "$RUNNER_NOEXPORT" ]]; then
# print_log "Export system data"
# run $BINDIR/system_exports.py
# fi
print_log "Export QC data"
run $BINDIR/human_exports_qc.py
# if [[ -n "$RUNNER_IMPORT" ]]; then
# print_log "Import system data"
# run $BINDIR/system_imports.py
# fi
print_log "Export sequence data"
run $BINDIR/human_exports_seis.py
# print_log "Export QC data"
# run $BINDIR/human_exports_qc.py
# print_log "Export sequence data"
# run $BINDIR/human_exports_seis.py
print_log "Process ASAQC queue"
# Run insecure in test mode:
# export NODE_TLS_REJECT_UNAUTHORIZED=0
PROGNAME=asaqc_queue run $DOUGAL_ROOT/lib/www/server/queues/asaqc/index.js
print_log "Run database housekeeping actions"
run $BINDIR/housekeep_database.py
print_log "Run QCs"
PROGNAME=run_qc run $DOUGAL_ROOT/lib/www/server/lib/qc/index.js
rm "$LOCKFILE"


@@ -24,6 +24,7 @@ locals().update(configuration.vars())
exportables = {
"public": {
"projects": [ "meta" ],
"info": None,
"real_time_inputs": None
},
"survey": {
@@ -32,12 +33,13 @@ exportables = {
"preplot_lines": [ "remarks", "ntba", "meta" ],
"preplot_points": [ "ntba", "meta" ],
"raw_lines": [ "remarks", "meta" ],
"raw_shots": [ "meta" ]
"raw_shots": [ "meta" ],
"planned_lines": None
}
}
def primary_key (table, cursor):
# https://wiki.postgresql.org/wiki/Retrieve_primary_key_columns
qry = """
SELECT a.attname, format_type(a.atttypid, a.atttypmod) AS data_type
@@ -48,7 +50,7 @@ def primary_key (table, cursor):
WHERE i.indrelid = %s::regclass
AND i.indisprimary;
"""
cursor.execute(qry, (table,))
return cursor.fetchall()


@@ -40,6 +40,10 @@ if __name__ == '__main__':
continue
try:
for table in exportables:
path = os.path.join(pathPrefix, table)
if os.path.exists(path):
cursor.execute(f"DELETE FROM {table};")
for table in exportables:
path = os.path.join(pathPrefix, table)
print("", path, "", table)


@@ -19,6 +19,7 @@ locals().update(configuration.vars())
exportables = {
"public": {
"projects": [ "meta" ],
"info": None,
"real_time_inputs": None
},
"survey": {
@@ -27,12 +28,13 @@ exportables = {
"preplot_lines": [ "remarks", "ntba", "meta" ],
"preplot_points": [ "ntba", "meta" ],
"raw_lines": [ "remarks", "meta" ],
"raw_shots": [ "meta" ]
"raw_shots": [ "meta" ],
"planned_lines": None
}
}
def primary_key (table, cursor):
# https://wiki.postgresql.org/wiki/Retrieve_primary_key_columns
qry = """
SELECT a.attname, format_type(a.atttypid, a.atttypmod) AS data_type
@@ -43,13 +45,13 @@ def primary_key (table, cursor):
WHERE i.indrelid = %s::regclass
AND i.indisprimary;
"""
cursor.execute(qry, (table,))
return cursor.fetchall()
def import_table(fd, table, columns, cursor):
pk = [ r[0] for r in primary_key(table, cursor) ]
# Create temporary table to import into
temptable = "import_"+table
print("Creating temporary table", temptable)
@@ -59,29 +61,29 @@ def import_table(fd, table, columns, cursor):
AS SELECT {', '.join(pk + columns)} FROM {table}
WITH NO DATA;
"""
#print(qry)
cursor.execute(qry)
# Import into the temp table
print("Import data into temporary table")
cursor.copy_from(fd, temptable)
# Update the destination table
print("Updating destination table")
setcols = ", ".join([ f"{c} = t.{c}" for c in columns ])
wherecols = " AND ".join([ f"{table}.{c} = t.{c}" for c in pk ])
qry = f"""
UPDATE {table}
SET {setcols}
FROM {temptable} t
WHERE {wherecols};
"""
#print(qry)
cursor.execute(qry)
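For a concrete sense of what `import_table` executes, assume a hypothetical run against table `raw_lines` with primary key `sequence` and imported columns `remarks` and `meta`: the data is copied into a temporary table `import_raw_lines` and the final statement expands to `UPDATE raw_lines SET remarks = t.remarks, meta = t.meta FROM import_raw_lines t WHERE raw_lines.sequence = t.sequence;`.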
if __name__ == '__main__':
@@ -96,16 +98,21 @@ if __name__ == '__main__':
with db.conn.cursor() as cursor:
columns = exportables["public"][table]
path = os.path.join(VARDIR, "-"+table)
with open(path, "rb") as fd:
print(" →→ ", path, " ←← ", table, columns)
if columns is not None:
import_table(fd, table, columns, cursor)
else:
try:
print(f"Copying from {path} into {table}")
cursor.copy_from(fd, table)
except psycopg2.errors.UniqueViolation:
print(f"It looks like table {table} may have already been imported. Skipping it.")
try:
with open(path, "rb") as fd:
print(" →→ ", path, " ←← ", table, columns)
if columns is not None:
import_table(fd, table, columns, cursor)
else:
try:
print(f"Copying from {path} into {table}")
cursor.copy_from(fd, table)
except psycopg2.errors.UniqueViolation:
print(f"It looks like table {table} may have already been imported. Skipping it.")
except FileNotFoundError:
print(f"File not found. Skipping {path}")
db.conn.commit()
print("Reading surveys")
for survey in surveys:
@@ -123,17 +130,20 @@ if __name__ == '__main__':
columns = exportables["survey"][table]
path = os.path.join(pathPrefix, "-"+table)
print(" ←← ", path, " →→ ", table, columns)
with open(path, "rb") as fd:
if columns is not None:
import_table(fd, table, columns, cursor)
else:
try:
print(f"Copying from {path} into {table}")
cursor.copy_from(fd, table)
except psycopg2.errors.UniqueViolation:
print(f"It looks like table {table} may have already been imported. Skipping it.")
try:
with open(path, "rb") as fd:
if columns is not None:
import_table(fd, table, columns, cursor)
else:
try:
print(f"Copying from {path} into {table}")
cursor.copy_from(fd, table)
except psycopg2.errors.UniqueViolation:
print(f"It looks like table {table} may have already been imported. Skipping it.")
except FileNotFoundError:
print(f"File not found. Skipping {path}")
# If we don't commit, the data does not actually get copied
db.conn.commit()

etc/config.example.yaml Normal file

@@ -0,0 +1,65 @@
db:
connection_string: "host=localhost port=5432 dbname=dougal user=postgres"
webhooks:
alert:
url: https://gitlab.com/wgp/dougal/software/alerts/notify.json
authkey: ""
# The authorisation key can be provided here or read from the
# environment variable GITLAB_ALERTS_AUTHKEY. The environment
# variable has precedence. It can be saved under the user's
# Bash .profile. This is the recommended way to avoid accidentally
# committing a security token into the git repository.
navigation:
headers:
-
type: udp
port: 30000
meta:
# Anything here gets passed as options to the packet
# saving routine.
epsg: 23031 # Assume this CRS for unqualified E/N data
# Heuristics to apply to detect survey when offline
offline_survey_heuristics: "nearest_preplot"
# Apply the heuristics at most once every…
offline_survey_detect_interval: 10000 # ms
imports:
# For a file to be imported, it must have been last modified at
# least this many seconds ago.
file_min_age: 60
# These paths refer to remote mounts which must be present in order
# for imports to work. If any of these paths are empty, import actions
# (including data deletion) will be inhibited. This is to cope with
# things like transient network failures.
mounts:
- /srv/mnt/Data
# These paths can be exposed to end users via the API. They should
# contain the locations where project data, or any other user data
# that needs to be accessible by Dougal, is located.
#
# This key can be either a string or an object:
# - If a string, it points to the root path for Dougal-accessible data.
# - If an object, there is an implicit root and the first-level
# paths are denoted by the keys, with the values being their
# respective physical paths.
# Non-absolute paths are relative to $DOUGAL_ROOT.
paths: /srv/mnt/Data
queues:
asaqc:
request:
url: "https://api.gateway.equinor.com/vt/v1/api/upload-file-encoded"
args:
method: POST
headers:
Content-Type: application/json
httpsAgent: # The paths here are relative to $DOUGAL_ROOT
cert: etc/ssl/asaqc.crt
key: etc/ssl/asaqc.key
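The `paths` key above drives the logical/physical path split used throughout the import scripts in this changeset. The real helpers live in the `configuration` module, which is not shown in this diff; the following is only a sketch of how the object form might be resolved, with the mapping and all names assumed:

```python
# Illustrative sketch, not the real implementation.
PATHS = {"Data": "/srv/mnt/Data"}  # logical first-level name -> physical root

def translate_path(logical):
    # e.g. "/Data/gis/file.geojson" -> "/srv/mnt/Data/gis/file.geojson"
    head, _, rest = logical.strip("/").partition("/")
    if head not in PATHS:
        raise TypeError(f"no physical mapping for {logical}")
    return "/".join([PATHS[head].rstrip("/"), rest])

def untranslate_path(physical):
    # Inverse mapping, back to the logical form stored in the database.
    for head, root in PATHS.items():
        if physical.startswith(root):
            return "/" + head + physical[len(root):]
    raise TypeError(f"no logical mapping for {physical}")
```

Raising `TypeError` on an unmapped path would be consistent with the `except TypeError` handler added to `purge_deleted_files.py` above.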


@@ -1,24 +0,0 @@
db:
connection_string: "host=localhost port=5432 dbname=dougal user=postgres"
webhooks:
alert:
url: https://gitlab.com/wgp/dougal/software/alerts/notify.json
authkey: ""
# The authorisation key can be provided here or read from the
# environment variable GITLAB_ALERTS_AUTHKEY. The environment
# variable has precedence. It can be saved under the user's
# Bash .profile. This is the recommended way to avoid accidentally
# committing a security token into the git repository.
navigation:
headers:
-
type: udp
port: 30000
meta:
# Anything here gets passed as options to the packet
# saving routine.
epsg: 23031 # Assume this CRS for unqualified E/N data


@@ -19,3 +19,124 @@ Created with:
```bash
SCHEMA_NAME=survey_X EPSG_CODE=XXXXX $DOUGAL_ROOT/sbin/dump_schema.sh
```
## To create a new Dougal database
Ensure that the following packages are installed:
* `postgresql*-postgis-utils`
* `postgresql*-postgis`
* `postgresql*-contrib` # For B-trees
```bash
psql -U postgres <./database-template.sql
psql -U postgres <./database-version.sql
```
---
# Upgrading PostgreSQL
The following is based on https://en.opensuse.org/SDB:PostgreSQL#Upgrading_major_PostgreSQL_version
```bash
# The following bash code should be checked and executed
# line for line whenever you do an upgrade. The example
# shows the upgrade process from an original installation
# of version 12 up to version 14.
# install the new server as well as the required postgresql-contrib packages:
zypper in postgresql14-server postgresql14-contrib postgresql12-contrib
# If not yet done, create a new global PostgreSQL configuration directory...
mkdir /etc/postgresql
# and copy the original files to this global directory
cd /srv/pgsql/data
for i in pg_hba.conf pg_ident.conf postgresql.conf postgresql.auto.conf ; do cp -a $i /etc/postgresql/$i ; done
# Now create a new data-directory and initialize it for usage with the new server
install -d -m 0700 -o postgres -g postgres /srv/pgsql/data14
cd /srv/pgsql/data14
sudo -u postgres /usr/lib/postgresql14/bin/initdb .
# replace the newly generated files by a symlink to the global files.
# After doing so, you may check the difference of the created backup files and
# the files from the former installation
for i in pg_hba.conf pg_ident.conf postgresql.conf postgresql.auto.conf ; do old $i ; ln -s /etc/postgresql/$i .; done
# Copy over special thesaurus files if some exists.
#cp -a /usr/share/postgresql12/tsearch_data/my_thesaurus_german.ths /usr/share/postgresql14/tsearch_data/
# Now it's time to stop the service...
systemctl stop postgresql.service
# And to start the migration. Please ensure the directories match your upgrade path
sudo -u postgres /usr/lib/postgresql14/bin/pg_upgrade --link \
--old-bindir="/usr/lib/postgresql12/bin" \
--new-bindir="/usr/lib/postgresql14/bin" \
--old-datadir="/srv/pgsql/data/" \
--new-datadir="/srv/pgsql/data14/"
# NOTE: If getting the following error:
# lc_collate values for database "postgres" do not match: old "en_US.UTF-8", new "C"
# then:
# cd ..
# rm -rf /srv/pgsql/data14
# install -d -m 0700 -o postgres -g postgres /srv/pgsql/data14
# cd /srv/pgsql/data14
# sudo -u postgres /usr/lib/postgresql14/bin/initdb --locale=en_US.UTF-8 .
#
# and repeat the migration command
# After successfully migrating the data...
cd ..
# if not already symlinked move the old data to a versioned directory matching
# your old installation...
mv data data12
# and set a symlink to the new data directory
ln -sf data14/ data
# Now start the new service
systemctl start postgresql.service
# If everything has been successful, you should uninstall old packages...
#zypper rm -u postgresql12 postgresql13
# and remove old data directories
#rm -rf /srv/pgsql/data_OLD_POSTGRES_VERSION_NUMBER
# For good measure:
sudo -u postgres /usr/lib/postgresql14/bin/vacuumdb --all --analyze-in-stages
# If update_extensions.sql exists, apply it.
```
# Restoring from backup
## Whole database
Ensure that nothing is connected to the database.
```bash
psql -U postgres --dbname postgres <<EOF
-- Database: dougal
DROP DATABASE IF EXISTS dougal;
CREATE DATABASE dougal
WITH
OWNER = postgres
ENCODING = 'UTF8'
LC_COLLATE = 'en_GB.UTF-8'
LC_CTYPE = 'en_GB.UTF-8'
TABLESPACE = pg_default
CONNECTION LIMIT = -1;
ALTER DATABASE dougal
SET search_path TO "$user", public, topology;
EOF
# Adjust --jobs according to host machine
pg_restore -U postgres --dbname dougal --clean --if-exists --jobs 32 /path/to/backup
```


@@ -2,8 +2,8 @@
-- PostgreSQL database dump
--
-- Dumped from database version 12.4
-- Dumped by pg_dump version 12.4
-- Dumped from database version 14.2
-- Dumped by pg_dump version 14.2
SET statement_timeout = 0;
SET lock_timeout = 0;
@@ -102,20 +102,6 @@ CREATE EXTENSION IF NOT EXISTS postgis WITH SCHEMA public;
COMMENT ON EXTENSION postgis IS 'PostGIS geometry, geography, and raster spatial types and functions';
--
-- Name: postgis_raster; Type: EXTENSION; Schema: -; Owner: -
--
CREATE EXTENSION IF NOT EXISTS postgis_raster WITH SCHEMA public;
--
-- Name: EXTENSION postgis_raster; Type: COMMENT; Schema: -; Owner:
--
COMMENT ON EXTENSION postgis_raster IS 'PostGIS raster types and functions';
--
-- Name: postgis_sfcgal; Type: EXTENSION; Schema: -; Owner: -
--
@@ -144,6 +130,221 @@ CREATE EXTENSION IF NOT EXISTS postgis_topology WITH SCHEMA topology;
COMMENT ON EXTENSION postgis_topology IS 'PostGIS topology spatial types and functions';
--
-- Name: queue_item_status; Type: TYPE; Schema: public; Owner: postgres
--
CREATE TYPE public.queue_item_status AS ENUM (
'queued',
'cancelled',
'failed',
'sent'
);
ALTER TYPE public.queue_item_status OWNER TO postgres;
--
-- Name: event_meta(timestamp with time zone); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.event_meta(tstamp timestamp with time zone) RETURNS jsonb
LANGUAGE plpgsql
AS $$
BEGIN
RETURN event_meta(tstamp, NULL, NULL);
END;
$$;
ALTER FUNCTION public.event_meta(tstamp timestamp with time zone) OWNER TO postgres;
--
-- Name: FUNCTION event_meta(tstamp timestamp with time zone); Type: COMMENT; Schema: public; Owner: postgres
--
COMMENT ON FUNCTION public.event_meta(tstamp timestamp with time zone) IS 'Overload of event_meta (timestamptz, integer, integer) for use when searching by timestamp.';
--
-- Name: event_meta(integer, integer); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.event_meta(sequence integer, point integer) RETURNS jsonb
LANGUAGE plpgsql
AS $$
BEGIN
RETURN event_meta(NULL, sequence, point);
END;
$$;
ALTER FUNCTION public.event_meta(sequence integer, point integer) OWNER TO postgres;
--
-- Name: FUNCTION event_meta(sequence integer, point integer); Type: COMMENT; Schema: public; Owner: postgres
--
COMMENT ON FUNCTION public.event_meta(sequence integer, point integer) IS 'Overload of event_meta (timestamptz, integer, integer) for use when searching by sequence / point.';
--
-- Name: event_meta(timestamp with time zone, integer, integer); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.event_meta(tstamp timestamp with time zone, sequence integer, point integer) RETURNS jsonb
LANGUAGE plpgsql
AS $$
DECLARE
result jsonb;
-- Tolerance is hard-coded, at least until a need to expose arises.
tolerance numeric;
BEGIN
tolerance := 3; -- seconds
-- We search by timestamp if we can, as that's a lot quicker
IF tstamp IS NOT NULL THEN
SELECT meta
INTO result
FROM real_time_inputs rti
WHERE
rti.tstamp BETWEEN (event_meta.tstamp - tolerance * interval '1 second') AND (event_meta.tstamp + tolerance * interval '1 second')
ORDER BY abs(extract('epoch' FROM rti.tstamp - event_meta.tstamp ))
LIMIT 1;
ELSE
SELECT meta
INTO result
FROM real_time_inputs rti
WHERE
(meta->>'_sequence')::integer = event_meta.sequence AND
(meta->>'_point')::integer = event_meta.point
ORDER BY rti.tstamp DESC
LIMIT 1;
END IF;
RETURN result;
END;
$$;
ALTER FUNCTION public.event_meta(tstamp timestamp with time zone, sequence integer, point integer) OWNER TO postgres;
--
-- Name: FUNCTION event_meta(tstamp timestamp with time zone, sequence integer, point integer); Type: COMMENT; Schema: public; Owner: postgres
--
COMMENT ON FUNCTION public.event_meta(tstamp timestamp with time zone, sequence integer, point integer) IS 'Return the real-time event metadata associated with a sequence / point in the current project or
with a given timestamp. The timestamp is first searched for in the shot tables
of the current prospect or, if not found, in the real-time data.
Returns a JSONB object.';
--
-- Name: geometry_from_tstamp(timestamp with time zone, numeric); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.geometry_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT geometry public.geometry, OUT delta numeric) RETURNS record
LANGUAGE sql
AS $$
SELECT
geometry,
extract('epoch' FROM tstamp - ts ) AS delta
FROM real_time_inputs
WHERE
geometry IS NOT NULL AND
tstamp BETWEEN (ts - tolerance * interval '1 second') AND (ts + tolerance * interval '1 second')
ORDER BY abs(extract('epoch' FROM tstamp - ts ))
LIMIT 1;
$$;
ALTER FUNCTION public.geometry_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT geometry public.geometry, OUT delta numeric) OWNER TO postgres;
--
-- Name: FUNCTION geometry_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT geometry public.geometry, OUT delta numeric); Type: COMMENT; Schema: public; Owner: postgres
--
COMMENT ON FUNCTION public.geometry_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT geometry public.geometry, OUT delta numeric) IS 'Get geometry from timestamp';
--
-- Name: interpolate_geometry_from_tstamp(timestamp with time zone, numeric); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.interpolate_geometry_from_tstamp(ts timestamp with time zone, maxspan numeric) RETURNS public.geometry
LANGUAGE plpgsql
AS $$
DECLARE
ts0 timestamptz;
ts1 timestamptz;
geom0 geometry;
geom1 geometry;
span numeric;
fraction numeric;
BEGIN
SELECT tstamp, geometry
INTO ts0, geom0
FROM real_time_inputs
WHERE tstamp <= ts
ORDER BY tstamp DESC
LIMIT 1;
SELECT tstamp, geometry
INTO ts1, geom1
FROM real_time_inputs
WHERE tstamp >= ts
ORDER BY tstamp ASC
LIMIT 1;
IF geom0 IS NULL OR geom1 IS NULL THEN
RAISE NOTICE 'Interpolation failed (no straddling data)';
RETURN NULL;
END IF;
-- See if we got an exact match
IF ts0 = ts THEN
RETURN geom0;
ELSIF ts1 = ts THEN
RETURN geom1;
END IF;
span := extract('epoch' FROM ts1 - ts0);
IF span > maxspan THEN
RAISE NOTICE 'Interpolation timespan % outside maximum requested (%)', span, maxspan;
RETURN NULL;
END IF;
fraction := extract('epoch' FROM ts - ts0) / span;
IF fraction < 0 OR fraction > 1 THEN
RAISE NOTICE 'Requested timestamp % outside of interpolation span (fraction: %)', ts, fraction;
RETURN NULL;
END IF;
RETURN ST_LineInterpolatePoint(St_MakeLine(geom0, geom1), fraction);
END;
$$;
ALTER FUNCTION public.interpolate_geometry_from_tstamp(ts timestamp with time zone, maxspan numeric) OWNER TO postgres;
--
-- Name: FUNCTION interpolate_geometry_from_tstamp(ts timestamp with time zone, maxspan numeric); Type: COMMENT; Schema: public; Owner: postgres
--
COMMENT ON FUNCTION public.interpolate_geometry_from_tstamp(ts timestamp with time zone, maxspan numeric) IS 'Interpolate a position over a given maximum timespan (in seconds)
based on real-time inputs. Returns a POINT geometry.';
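For example, a call such as `SELECT public.interpolate_geometry_from_tstamp('2023-10-17 12:00:05+00', 10);` (timestamp made up) returns the position linearly interpolated between the two real-time fixes straddling that instant, or NULL if they are more than 10 seconds apart.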
--
-- Name: notify(); Type: FUNCTION; Schema: public; Owner: postgres
--
@@ -182,23 +383,110 @@ $$;
ALTER FUNCTION public.notify() OWNER TO postgres;
--
-- Name: sequence_shot_from_tstamp(timestamp with time zone); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, OUT sequence numeric, OUT point numeric, OUT delta numeric) RETURNS record
LANGUAGE sql
AS $$
SELECT * FROM public.sequence_shot_from_tstamp(ts, 3);
$$;
ALTER FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, OUT sequence numeric, OUT point numeric, OUT delta numeric) OWNER TO postgres;
--
-- Name: FUNCTION sequence_shot_from_tstamp(ts timestamp with time zone, OUT sequence numeric, OUT point numeric, OUT delta numeric); Type: COMMENT; Schema: public; Owner: postgres
--
COMMENT ON FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, OUT sequence numeric, OUT point numeric, OUT delta numeric) IS 'Get sequence and shotpoint from timestamp.
Overloaded form in which the tolerance value is implied and defaults to three seconds.';
--
-- Name: sequence_shot_from_tstamp(timestamp with time zone, numeric); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT sequence numeric, OUT point numeric, OUT delta numeric) RETURNS record
LANGUAGE sql
AS $$
SELECT
(meta->>'_sequence')::numeric AS sequence,
(meta->>'_point')::numeric AS point,
extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ) AS delta
FROM real_time_inputs
WHERE
meta ? '_sequence' AND
abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts )) < tolerance
ORDER BY abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ))
LIMIT 1;
$$;
ALTER FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT sequence numeric, OUT point numeric, OUT delta numeric) OWNER TO postgres;
--
-- Name: FUNCTION sequence_shot_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT sequence numeric, OUT point numeric, OUT delta numeric); Type: COMMENT; Schema: public; Owner: postgres
--
COMMENT ON FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT sequence numeric, OUT point numeric, OUT delta numeric) IS 'Get sequence and shotpoint from timestamp.
Given a timestamp this function returns the closest shot to it within the given tolerance value.
This uses the `real_time_inputs` table and it does not give an indication of which project the shotpoint belongs to. It is assumed that a single project is being acquired at a given time.';
--
-- Name: set_survey(text); Type: PROCEDURE; Schema: public; Owner: postgres
--
CREATE PROCEDURE public.set_survey(project_id text)
CREATE PROCEDURE public.set_survey(IN project_id text)
LANGUAGE sql
AS $$
SELECT set_config('search_path', (SELECT schema||',public' FROM public.projects WHERE pid = lower(project_id)), false);
$$;
ALTER PROCEDURE public.set_survey(project_id text) OWNER TO postgres;
ALTER PROCEDURE public.set_survey(IN project_id text) OWNER TO postgres;
--
-- Name: update_timestamp(); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.update_timestamp() RETURNS trigger
LANGUAGE plpgsql
AS $$
BEGIN
IF NEW.updated_on IS NOT NULL THEN
NEW.updated_on := current_timestamp;
END IF;
RETURN NEW;
EXCEPTION
WHEN undefined_column THEN RETURN NEW;
END;
$$;
ALTER FUNCTION public.update_timestamp() OWNER TO postgres;
SET default_tablespace = '';
SET default_table_access_method = heap;
--
-- Name: info; Type: TABLE; Schema: public; Owner: postgres
--
CREATE TABLE public.info (
key text NOT NULL,
value jsonb
);
ALTER TABLE public.info OWNER TO postgres;
--
-- Name: projects; Type: TABLE; Schema: public; Owner: postgres
--
@@ -213,6 +501,46 @@ CREATE TABLE public.projects (
ALTER TABLE public.projects OWNER TO postgres;
--
-- Name: queue_items; Type: TABLE; Schema: public; Owner: postgres
--
CREATE TABLE public.queue_items (
item_id integer NOT NULL,
status public.queue_item_status DEFAULT 'queued'::public.queue_item_status NOT NULL,
payload jsonb NOT NULL,
results jsonb DEFAULT '{}'::jsonb NOT NULL,
created_on timestamp with time zone DEFAULT CURRENT_TIMESTAMP NOT NULL,
updated_on timestamp with time zone DEFAULT CURRENT_TIMESTAMP NOT NULL,
not_before timestamp with time zone DEFAULT '1970-01-01 00:00:00+00'::timestamp with time zone NOT NULL,
parent_id integer
);
ALTER TABLE public.queue_items OWNER TO postgres;
--
-- Name: queue_items_item_id_seq; Type: SEQUENCE; Schema: public; Owner: postgres
--
CREATE SEQUENCE public.queue_items_item_id_seq
AS integer
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
ALTER TABLE public.queue_items_item_id_seq OWNER TO postgres;
--
-- Name: queue_items_item_id_seq; Type: SEQUENCE OWNED BY; Schema: public; Owner: postgres
--
ALTER SEQUENCE public.queue_items_item_id_seq OWNED BY public.queue_items.item_id;
--
-- Name: real_time_inputs; Type: TABLE; Schema: public; Owner: postgres
--
@@ -226,6 +554,21 @@ CREATE TABLE public.real_time_inputs (
ALTER TABLE public.real_time_inputs OWNER TO postgres;
--
-- Name: queue_items item_id; Type: DEFAULT; Schema: public; Owner: postgres
--
ALTER TABLE ONLY public.queue_items ALTER COLUMN item_id SET DEFAULT nextval('public.queue_items_item_id_seq'::regclass);
--
-- Name: info info_pkey; Type: CONSTRAINT; Schema: public; Owner: postgres
--
ALTER TABLE ONLY public.info
ADD CONSTRAINT info_pkey PRIMARY KEY (key);
--
-- Name: projects projects_name_key; Type: CONSTRAINT; Schema: public; Owner: postgres
--
@@ -250,6 +593,14 @@ ALTER TABLE ONLY public.projects
ADD CONSTRAINT projects_schema_key UNIQUE (schema);
--
-- Name: queue_items queue_items_pkey; Type: CONSTRAINT; Schema: public; Owner: postgres
--
ALTER TABLE ONLY public.queue_items
ADD CONSTRAINT queue_items_pkey PRIMARY KEY (item_id);
--
-- Name: tstamp_idx; Type: INDEX; Schema: public; Owner: postgres
--
@@ -257,6 +608,13 @@ ALTER TABLE ONLY public.projects
CREATE INDEX tstamp_idx ON public.real_time_inputs USING btree (tstamp DESC);
--
-- Name: info info_tg; Type: TRIGGER; Schema: public; Owner: postgres
--
CREATE TRIGGER info_tg AFTER INSERT OR DELETE OR UPDATE ON public.info FOR EACH ROW EXECUTE FUNCTION public.notify('info');
--
-- Name: projects projects_tg; Type: TRIGGER; Schema: public; Owner: postgres
--
@@ -264,6 +622,20 @@ CREATE INDEX tstamp_idx ON public.real_time_inputs USING btree (tstamp DESC);
CREATE TRIGGER projects_tg AFTER INSERT OR DELETE OR UPDATE ON public.projects FOR EACH ROW EXECUTE FUNCTION public.notify('project');
--
-- Name: queue_items queue_items_tg0; Type: TRIGGER; Schema: public; Owner: postgres
--
CREATE TRIGGER queue_items_tg0 BEFORE INSERT OR UPDATE ON public.queue_items FOR EACH ROW EXECUTE FUNCTION public.update_timestamp();
--
-- Name: queue_items queue_items_tg1; Type: TRIGGER; Schema: public; Owner: postgres
--
CREATE TRIGGER queue_items_tg1 AFTER INSERT OR DELETE OR UPDATE ON public.queue_items FOR EACH ROW EXECUTE FUNCTION public.notify('queue_items');
--
-- Name: real_time_inputs real_time_inputs_tg; Type: TRIGGER; Schema: public; Owner: postgres
--
@@ -271,6 +643,14 @@ CREATE TRIGGER projects_tg AFTER INSERT OR DELETE OR UPDATE ON public.projects F
CREATE TRIGGER real_time_inputs_tg AFTER INSERT ON public.real_time_inputs FOR EACH ROW EXECUTE FUNCTION public.notify('realtime');
--
-- Name: queue_items queue_items_parent_id_fkey; Type: FK CONSTRAINT; Schema: public; Owner: postgres
--
ALTER TABLE ONLY public.queue_items
ADD CONSTRAINT queue_items_parent_id_fkey FOREIGN KEY (parent_id) REFERENCES public.queue_items(item_id);
--
-- PostgreSQL database dump complete
--


@@ -0,0 +1,5 @@
\connect dougal
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.2"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.2"}' WHERE public.info.key = 'version';

File diff suppressed because it is too large

etc/db/upgrades/README.md Normal file

@@ -0,0 +1,34 @@
# Database schema upgrades
When the database schema needs to be upgraded in order to provide new functionality, fix errors, etc., an upgrade script should be added to this directory.
The script can be SQL (preferred) or anything else (Bash, Python, …) in the event of complex upgrades.
The script itself should:
* document what the intended changes are;
* contain instructions on how to run it;
* make the user aware of any non-obvious side effects; and
* say if it is safe to run the script multiple times on the same schema / database.
## Naming
Script files should be named `upgrade-<index>-<commit-id-old>-<commit-id-new>-v<schema-version>.sql`, where:
* `<index>` is a correlative two-digit index. When reaching 99, existing files will be renamed to a three digit index (001-099) and new files will use three digits.
* `<commit-id-old>` is the ID of the Git commit that last introduced a schema change.
* `<commit-id-new>` is the ID of the first Git commit expecting the updated schema.
* `<schema-version>` is the version of the schema.
Note: the `<schema-version>` value should be updated with every change and it should be the same as reported by:
```sql
select value->>'db_schema' as db_schema from public.info where key = 'version';
```
If necessary, the wanted schema version must also be updated in `package.json`.
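For example, a hypothetical upgrade taking the schema from commit `6e7ba82e` to `53f71f70` at version 0.4.2 would be named `upgrade-02-6e7ba82e-53f71f70-v0.4.2.sql` (index and version chosen for illustration).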
## Running
Schema upgrades are always run manually.


@@ -0,0 +1,22 @@
-- Upgrade the database from commit 78adb2be to 7917eeeb.
--
-- This upgrade affects the `public` schema only.
--
-- It creates a new table, `info`, for storing arbitrary JSON
-- data not belonging to a specific project. Currently used
-- for the equipment list, it could also serve to store user
-- details, configuration settings, system state, etc.
--
-- To apply, run as the dougal user:
--
-- psql < $THIS_FILE
--
-- NOTE: It will fail harmlessly if applied twice.
CREATE TABLE IF NOT EXISTS public.info (
key text NOT NULL primary key,
value jsonb
);
CREATE TRIGGER info_tg AFTER INSERT OR DELETE OR UPDATE ON public.info FOR EACH ROW EXECUTE FUNCTION public.notify('info');
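-- Example (hypothetical key and value): the table is a simple key/value
-- store, so an equipment list could be upserted like this:
--
-- INSERT INTO public.info VALUES ('equipment', '{"streamers": 12}')
-- ON CONFLICT (key) DO UPDATE SET value = EXCLUDED.value;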


@@ -0,0 +1,160 @@
-- Upgrade the database from commit 6e7ba82e to 53f71f70.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This merges two changes to the database.
-- The first one (commit 5de64e6b) modifies the `event` view to return
-- the `meta` column of timed and sequence events.
-- The second one (commit 53f71f70) adds a primary key constraint to
-- events_seq_labels (there is already an equivalent constraint on
-- events_seq_timed).
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It will fail harmlessly if applied twice.
BEGIN;
DROP VIEW events_seq_timed CASCADE; -- Brings down events too
ALTER TABLE ONLY events_seq_labels
ADD CONSTRAINT events_seq_labels_pkey PRIMARY KEY (id, label);
CREATE OR REPLACE VIEW events_seq_timed AS
SELECT s.sequence,
s.point,
s.id,
s.remarks,
rs.line,
rs.objref,
rs.tstamp,
rs.hash,
s.meta,
rs.geometry
FROM (events_seq s
LEFT JOIN raw_shots rs USING (sequence, point));
CREATE OR REPLACE VIEW events AS
WITH qc AS (
SELECT rs.sequence,
rs.point,
ARRAY[jsonb_array_elements_text(q.labels)] AS labels
FROM raw_shots rs,
LATERAL jsonb_path_query(rs.meta, '$."qc".*."labels"'::jsonpath) q(labels)
)
SELECT 'sequence'::text AS type,
false AS virtual,
s.sequence,
s.point,
s.id,
s.remarks,
s.line,
s.objref,
s.tstamp,
s.hash,
s.meta,
(public.st_asgeojson(public.st_transform(s.geometry, 4326)))::jsonb AS geometry,
ARRAY( SELECT esl.label
FROM events_seq_labels esl
WHERE (esl.id = s.id)) AS labels
FROM events_seq_timed s
UNION
SELECT 'timed'::text AS type,
false AS virtual,
rs.sequence,
rs.point,
t.id,
t.remarks,
rs.line,
rs.objref,
t.tstamp,
rs.hash,
t.meta,
(t.meta -> 'geometry'::text) AS geometry,
ARRAY( SELECT etl.label
FROM events_timed_labels etl
WHERE (etl.id = t.id)) AS labels
FROM ((events_timed t
LEFT JOIN events_timed_seq ts USING (id))
LEFT JOIN raw_shots rs USING (sequence, point))
UNION
SELECT 'midnight shot'::text AS type,
true AS virtual,
v1.sequence,
v1.point,
((v1.sequence * 100000) + v1.point) AS id,
''::text AS remarks,
v1.line,
v1.objref,
v1.tstamp,
v1.hash,
'{}'::jsonb meta,
(public.st_asgeojson(public.st_transform(v1.geometry, 4326)))::jsonb AS geometry,
ARRAY[v1.label] AS labels
FROM events_midnight_shot v1
UNION
SELECT 'qc'::text AS type,
true AS virtual,
rs.sequence,
rs.point,
((10000000 + (rs.sequence * 100000)) + rs.point) AS id,
(q.remarks)::text AS remarks,
rs.line,
rs.objref,
rs.tstamp,
rs.hash,
'{}'::jsonb meta,
(public.st_asgeojson(public.st_transform(rs.geometry, 4326)))::jsonb AS geometry,
('{QC}'::text[] || qc.labels) AS labels
FROM (raw_shots rs
LEFT JOIN qc USING (sequence, point)),
LATERAL jsonb_path_query(rs.meta, '$."qc".*."results"'::jsonpath) q(remarks)
WHERE (rs.meta ? 'qc'::text);
CREATE OR REPLACE VIEW final_lines_summary AS
WITH summary AS (
SELECT DISTINCT fs.sequence,
first_value(fs.point) OVER w AS fsp,
last_value(fs.point) OVER w AS lsp,
first_value(fs.tstamp) OVER w AS ts0,
last_value(fs.tstamp) OVER w AS ts1,
count(fs.point) OVER w AS num_points,
public.st_distance(first_value(fs.geometry) OVER w, last_value(fs.geometry) OVER w) AS length,
((public.st_azimuth(first_value(fs.geometry) OVER w, last_value(fs.geometry) OVER w) * (180)::double precision) / pi()) AS azimuth
FROM final_shots fs
WINDOW w AS (PARTITION BY fs.sequence ORDER BY fs.tstamp ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
)
SELECT fl.sequence,
fl.line,
s.fsp,
s.lsp,
s.ts0,
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
(( SELECT count(*) AS count
FROM preplot_points
WHERE ((preplot_points.line = fl.line) AND (((preplot_points.point >= s.fsp) AND (preplot_points.point <= s.lsp)) OR ((preplot_points.point >= s.lsp) AND (preplot_points.point <= s.fsp))))) - s.num_points) AS missing_shots,
s.length,
s.azimuth,
fl.remarks,
fl.meta
FROM (summary s
JOIN final_lines fl USING (sequence));
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,171 @@
-- Upgrade the database from commit 53f71f70 to 4d977848.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This adds:
--
-- * label_in_sequence (_sequence integer, _label text):
-- Returns events containing the specified label.
--
-- * handle_final_line_events (_seq integer, _label text, _column text):
-- - If _label does not exist in the events for sequence _seq:
-- it adds a new _label label at the shotpoint obtained from
-- final_lines_summary[_column].
-- - If _label does exist (and hasn't been auto-added by this function
-- in a previous run), it will add information about it to the final
-- line's metadata.
--
-- * final_line_post_import (_seq integer):
-- Calls handle_final_line_events() on the given sequence to check
-- for FSP, FGSP, LGSP and LSP labels.
--
-- * events_seq_labels_single ():
-- Trigger function to ensure that labels that have the attribute
-- `model.multiple` set to `false` occur at most once per
-- sequence. If a new instance is added to a sequence, the previous
-- instance is deleted.
--
-- * Trigger on events_seq_labels that calls events_seq_labels_single().
--
-- * Trigger on events_timed_labels that calls events_seq_labels_single().
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It will fail harmlessly if applied twice.
BEGIN;
CREATE OR REPLACE FUNCTION label_in_sequence (_sequence integer, _label text)
RETURNS events
LANGUAGE sql
AS $$
SELECT * FROM events WHERE sequence = _sequence AND _label = ANY(labels);
$$;
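-- Example (hypothetical values): find the event carrying the FSP label
-- in sequence 123, if any:
--
-- SELECT * FROM label_in_sequence(123, 'FSP');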
CREATE OR REPLACE PROCEDURE handle_final_line_events (_seq integer, _label text, _column text)
LANGUAGE plpgsql
AS $$
DECLARE
_line final_lines_summary%ROWTYPE;
_column_value integer;
_tg_name text := 'final_line';
_event events%ROWTYPE;
event_id integer;
BEGIN
SELECT * INTO _line FROM final_lines_summary WHERE sequence = _seq;
_event := label_in_sequence(_seq, _label);
_column_value := row_to_json(_line)->>_column;
--RAISE NOTICE '% is %', _label, _event;
--RAISE NOTICE 'Line is %', _line;
--RAISE NOTICE '% is % (%)', _column, _column_value, _label;
IF _event IS NULL THEN
--RAISE NOTICE 'We will populate the event log from the sequence data';
SELECT id INTO event_id FROM events_seq WHERE sequence = _seq AND point = _column_value ORDER BY id LIMIT 1;
IF event_id IS NULL THEN
--RAISE NOTICE '… but there is no existing event so we create a new one for sequence % and point %', _line.sequence, _column_value;
INSERT INTO events_seq (sequence, point, remarks)
VALUES (_line.sequence, _column_value, format('%s %s', _label, (SELECT meta->>'lineName' FROM final_lines WHERE sequence = _seq)))
RETURNING id INTO event_id;
--RAISE NOTICE 'Created event_id %', event_id;
END IF;
--RAISE NOTICE 'Remove any other auto-inserted % labels in sequence %', _label, _seq;
DELETE FROM events_seq_labels
WHERE label = _label AND id = (SELECT id FROM events_seq WHERE sequence = _seq AND meta->'auto' ? _label);
--RAISE NOTICE 'We now add a label to the event (id, label) = (%, %)', event_id, _label;
INSERT INTO events_seq_labels (id, label) VALUES (event_id, _label) ON CONFLICT ON CONSTRAINT events_seq_labels_pkey DO NOTHING;
--RAISE NOTICE 'And also clear the %: % flag from meta.auto for any existing events for sequence %', _label, _tg_name, _seq;
UPDATE events_seq
SET meta = meta #- ARRAY['auto', _label]
WHERE meta->'auto' ? _label AND sequence = _seq AND id <> event_id;
--RAISE NOTICE 'Finally, flag the event as having been had label % auto-created by %', _label, _tg_name;
UPDATE events_seq
SET meta = jsonb_set(jsonb_set(meta, '{auto}', COALESCE(meta->'auto', '{}')), ARRAY['auto', _label], to_jsonb(_tg_name))
WHERE id = event_id;
ELSE
--RAISE NOTICE 'We may populate the sequence meta from the event log';
--RAISE NOTICE 'Unless the event log was populated by us previously';
--RAISE NOTICE 'Populated by us previously? %', _event.meta->'auto'->>_label = _tg_name;
IF _event.meta->'auto'->>_label IS DISTINCT FROM _tg_name THEN
--RAISE NOTICE 'Adding % found in events log to final_line meta', _label;
UPDATE final_lines
SET meta = jsonb_set(meta, ARRAY[_label], to_jsonb(_event.point))
WHERE sequence = _seq;
--RAISE NOTICE 'Clearing the %: % flag from meta.auto for any existing events in sequence %', _label, _tg_name, _seq;
UPDATE events_seq
SET meta = meta #- ARRAY['auto', _label]
WHERE sequence = _seq AND meta->'auto'->>_label = _tg_name;
END IF;
END IF;
END;
$$;
CREATE OR REPLACE PROCEDURE final_line_post_import (_seq integer)
LANGUAGE plpgsql
AS $$
BEGIN
CALL handle_final_line_events(_seq, 'FSP', 'fsp');
CALL handle_final_line_events(_seq, 'FGSP', 'fsp');
CALL handle_final_line_events(_seq, 'LGSP', 'lsp');
CALL handle_final_line_events(_seq, 'LSP', 'lsp');
END;
$$;
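-- Example (hypothetical sequence number): run the label checks manually
-- after importing a final line:
--
-- CALL final_line_post_import(123);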
CREATE OR REPLACE FUNCTION events_seq_labels_single ()
RETURNS trigger
LANGUAGE plpgsql
AS $$
DECLARE _sequence integer;
BEGIN
IF EXISTS(SELECT 1 FROM labels WHERE name = NEW.label AND (data->'model'->'multiple')::boolean IS FALSE) THEN
SELECT sequence INTO _sequence FROM events WHERE id = NEW.id;
DELETE
FROM events_seq_labels
WHERE
id <> NEW.id
AND label = NEW.label
AND id IN (SELECT id FROM events_seq WHERE sequence = _sequence);
DELETE
FROM events_timed_labels
WHERE
id <> NEW.id
AND label = NEW.label
AND id IN (SELECT id FROM events_timed_seq WHERE sequence = _sequence);
END IF;
RETURN NULL;
END;
$$;
CREATE TRIGGER events_seq_labels_single_tg AFTER INSERT OR UPDATE ON events_seq_labels FOR EACH ROW EXECUTE FUNCTION events_seq_labels_single();
CREATE TRIGGER events_seq_labels_single_tg AFTER INSERT OR UPDATE ON events_timed_labels FOR EACH ROW EXECUTE FUNCTION events_seq_labels_single();
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,94 @@
-- Upgrade the database from commit 4d977848 to 3d70a460.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This adds the `meta` column to the output of the following views:
--
-- * raw_lines_summary; and
-- * sequences_summary
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
BEGIN;
CREATE OR REPLACE VIEW raw_lines_summary AS
WITH summary AS (
SELECT DISTINCT rs.sequence,
first_value(rs.point) OVER w AS fsp,
last_value(rs.point) OVER w AS lsp,
first_value(rs.tstamp) OVER w AS ts0,
last_value(rs.tstamp) OVER w AS ts1,
count(rs.point) OVER w AS num_points,
count(pp.point) OVER w AS num_preplots,
public.st_distance(first_value(rs.geometry) OVER w, last_value(rs.geometry) OVER w) AS length,
((public.st_azimuth(first_value(rs.geometry) OVER w, last_value(rs.geometry) OVER w) * (180)::double precision) / pi()) AS azimuth
FROM (raw_shots rs
LEFT JOIN preplot_points pp USING (line, point))
WINDOW w AS (PARTITION BY rs.sequence ORDER BY rs.tstamp ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
)
SELECT rl.sequence,
rl.line,
s.fsp,
s.lsp,
s.ts0,
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
s.num_preplots,
(( SELECT count(*) AS count
FROM preplot_points
WHERE ((preplot_points.line = rl.line) AND (((preplot_points.point >= s.fsp) AND (preplot_points.point <= s.lsp)) OR ((preplot_points.point >= s.lsp) AND (preplot_points.point <= s.fsp))))) - s.num_preplots) AS missing_shots,
s.length,
s.azimuth,
rl.remarks,
rl.ntbp,
rl.meta
FROM (summary s
JOIN raw_lines rl USING (sequence));
DROP VIEW sequences_summary;
CREATE OR REPLACE VIEW sequences_summary AS
SELECT rls.sequence,
rls.line,
rls.fsp,
rls.lsp,
fls.fsp AS fsp_final,
fls.lsp AS lsp_final,
rls.ts0,
rls.ts1,
fls.ts0 AS ts0_final,
fls.ts1 AS ts1_final,
rls.duration,
fls.duration AS duration_final,
rls.num_preplots,
COALESCE(fls.num_points, rls.num_points) AS num_points,
COALESCE(fls.missing_shots, rls.missing_shots) AS missing_shots,
COALESCE(fls.length, rls.length) AS length,
COALESCE(fls.azimuth, rls.azimuth) AS azimuth,
rls.remarks,
fls.remarks AS remarks_final,
rls.meta,
fls.meta AS meta_final,
CASE
WHEN (rls.ntbp IS TRUE) THEN 'ntbp'::text
WHEN (fls.sequence IS NULL) THEN 'raw'::text
ELSE 'final'::text
END AS status
FROM (raw_lines_summary rls
LEFT JOIN final_lines_summary fls USING (sequence));
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,33 @@
-- Upgrade the database from commit 3d70a460 to 0983abac.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This:
--
-- * makes the primary key on planned_lines deferrable; and
-- * changes the planned_lines trigger from statement to row.
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
BEGIN;
ALTER TABLE planned_lines DROP CONSTRAINT planned_lines_pkey;
ALTER TABLE planned_lines ADD CONSTRAINT planned_lines_pkey PRIMARY KEY (sequence) DEFERRABLE;
DROP TRIGGER planned_lines_tg ON planned_lines;
CREATE TRIGGER planned_lines_tg AFTER INSERT OR DELETE OR UPDATE ON planned_lines FOR EACH ROW EXECUTE FUNCTION public.notify('planned_lines');
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,207 @@
-- Upgrade the database from commit 0983abac to 81d9ea19.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This defines a new procedure adjust_planner() which resolves some
-- conflicts between shot sequences and the planner, such as removing
-- sequences that have been shot, renumbering, or adjusting the planned
-- times.
--
-- It is meant to be called at regular intervals by an external process,
-- such as the runner (software/bin/runner.sh).
--
-- A trigger for changes to the schema's `info` table is also added.
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
BEGIN;
CREATE OR REPLACE PROCEDURE adjust_planner ()
LANGUAGE plpgsql
AS $$
DECLARE
_planner_config jsonb;
_planned_line planned_lines%ROWTYPE;
_lag interval;
_last_sequence sequences_summary%ROWTYPE;
_deltatime interval;
_shotinterval interval;
_tstamp timestamptz;
_incr integer;
BEGIN
SET CONSTRAINTS planned_lines_pkey DEFERRED;
SELECT data->'planner'
INTO _planner_config
FROM file_data
WHERE data ? 'planner';
SELECT *
INTO _last_sequence
FROM sequences_summary
ORDER BY sequence DESC
LIMIT 1;
SELECT *
INTO _planned_line
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
SELECT
COALESCE(
((lead(ts0) OVER (ORDER BY sequence)) - ts1),
make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer)
)
INTO _lag
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
_incr = sign(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE '_planner_config: %', _planner_config;
RAISE NOTICE '_last_sequence: %', _last_sequence;
RAISE NOTICE '_planned_line: %', _planned_line;
RAISE NOTICE '_incr: %', _incr;
-- Does the latest sequence match a planned sequence?
IF _planned_line IS NULL THEN -- No it doesn't
RAISE NOTICE 'Latest sequence shot does not match a planned sequence';
SELECT * INTO _planned_line FROM planned_lines ORDER BY sequence ASC LIMIT 1;
RAISE NOTICE '_planned_line: %', _planned_line;
IF _planned_line.sequence <= _last_sequence.sequence THEN
RAISE NOTICE 'Renumbering the planned sequences starting from %', _planned_line.sequence + 1;
-- Renumber the planned sequences starting from last shot sequence number + 1
UPDATE planned_lines
SET sequence = sequence + _last_sequence.sequence - _planned_line.sequence + 1;
END IF;
-- The correction to make to the first planned line's ts0 will be based on either the last
-- sequence's EOL + default line change time or the current time, whichever is later.
_deltatime := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1) + make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer), current_timestamp) - _planned_line.ts0;
-- Is the first planned line's start time in the past? (±5 mins)
IF _planned_line.ts0 < (current_timestamp - make_interval(mins => 5)) THEN
RAISE NOTICE 'First planned line is in the past. Adjusting times by %', _deltatime;
-- Adjust the start / end time of the planned lines by assuming that we are at
-- `defaultLineChangeDuration` minutes away from SOL of the first planned line.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime;
END IF;
ELSE -- Yes it does
RAISE NOTICE 'Latest sequence does match a planned sequence: %, %', _planned_line.sequence, _planned_line.line;
-- Is it online?
IF EXISTS(SELECT 1 FROM raw_lines_files WHERE sequence = _last_sequence.sequence AND hash = '*online*') THEN
-- Yes it is
RAISE NOTICE 'Sequence % is online', _last_sequence.sequence;
-- Let us get the SOL from the events log if we can
RAISE NOTICE 'Trying to set fsp, ts0 from events log FSP, FGSP';
WITH e AS (
SELECT * FROM events
WHERE
sequence = _last_sequence.sequence
AND ('FSP' = ANY(labels) OR 'FGSP' = ANY(labels))
ORDER BY tstamp LIMIT 1
)
UPDATE planned_lines
SET
fsp = COALESCE(e.point, fsp),
ts0 = COALESCE(e.tstamp, ts0)
FROM e
WHERE planned_lines.sequence = _last_sequence.sequence;
-- Shot interval
_shotinterval := (_last_sequence.ts1 - _last_sequence.ts0) / abs(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE 'Estimating EOL from current shot interval: %', _shotinterval;
SELECT (abs(lsp-fsp) * _shotinterval + ts0) - ts1
INTO _deltatime
FROM planned_lines
WHERE sequence = _last_sequence.sequence;
---- Set ts1 for the current sequence
--UPDATE planned_lines
--SET
--ts1 = (abs(lsp-fsp) * _shotinterval) + ts0
--WHERE sequence = _last_sequence.sequence;
RAISE NOTICE 'Adjustment is %', _deltatime;
IF abs(EXTRACT(EPOCH FROM _deltatime)) < 8 THEN
RAISE NOTICE 'Adjustment too small (< 8 s), so not applying it';
RETURN;
END IF;
-- Adjust ts1 for the current sequence
UPDATE planned_lines
SET ts1 = ts1 + _deltatime
WHERE sequence = _last_sequence.sequence;
-- Now shift all sequences after
UPDATE planned_lines
SET ts0 = ts0 + _deltatime, ts1 = ts1 + _deltatime
WHERE sequence > _last_sequence.sequence;
RAISE NOTICE 'Deleting planned sequences before %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence < _last_sequence.sequence;
ELSE
-- No it isn't
RAISE NOTICE 'Sequence % is offline', _last_sequence.sequence;
-- We were supposed to finish at _planned_line.ts1 but we finished at:
_tstamp := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1), current_timestamp);
-- WARNING Next line is for testing only
--_tstamp := COALESCE(_last_sequence.ts1_final, _last_sequence.ts1);
-- So we need to adjust timestamps by:
_deltatime := _tstamp - _planned_line.ts1;
RAISE NOTICE 'Planned end: %, actual end: % (%, %)', _planned_line.ts1, _tstamp, _planned_line.sequence, _last_sequence.sequence;
RAISE NOTICE 'Shifting times by % for sequences > %', _deltatime, _planned_line.sequence;
-- NOTE: This won't work if sequences are not, err… sequential.
-- NOTE: This has been known to happen in 2020.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime
WHERE sequence > _planned_line.sequence;
RAISE NOTICE 'Deleting planned sequences up to %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence <= _last_sequence.sequence;
END IF;
END IF;
END;
$$;
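-- Example: the runner normally calls this at regular intervals, but it
-- can also be invoked manually:
--
-- CALL adjust_planner();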
DROP TRIGGER IF EXISTS info_tg ON info;
CREATE TRIGGER info_tg AFTER INSERT OR DELETE OR UPDATE ON info FOR EACH ROW EXECUTE FUNCTION public.notify('info');
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,91 @@
-- Upgrade the database from commit 81d9ea19 to 0a10c897.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This defines a new function ij_error(line, point, geometry) which
-- returns the crossline and inline distance (in metres) between the
-- geometry (which must be a point) and the preplot corresponding to
-- line / point.
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
BEGIN;
-- Return the crossline, inline error of `geom` with respect to `line` and `point`
-- in the project's binning grid.
CREATE OR REPLACE FUNCTION ij_error(line double precision, point double precision, geom public.geometry)
RETURNS public.geometry(Point, 0)
LANGUAGE plpgsql STABLE LEAKPROOF
AS $$
DECLARE
bp jsonb := binning_parameters();
ij public.geometry := to_binning_grid(geom, bp);
theta numeric := (bp->>'theta')::numeric * pi() / 180;
I_inc numeric DEFAULT 1;
J_inc numeric DEFAULT 1;
I_width numeric := (bp->>'I_width')::numeric;
J_width numeric := (bp->>'J_width')::numeric;
a numeric := (I_inc/I_width) * cos(theta);
b numeric := (I_inc/I_width) * -sin(theta);
c numeric := (J_inc/J_width) * sin(theta);
d numeric := (J_inc/J_width) * cos(theta);
xoff numeric := (bp->'origin'->>'I')::numeric;
yoff numeric := (bp->'origin'->>'J')::numeric;
E0 numeric := (bp->'origin'->>'easting')::numeric;
N0 numeric := (bp->'origin'->>'northing')::numeric;
error_i double precision;
error_j double precision;
BEGIN
error_i := (public.st_x(ij) - line) * I_width;
error_j := (public.st_y(ij) - point) * J_width;
RETURN public.ST_MakePoint(error_i, error_j);
END
$$;
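-- Example (hypothetical line and point numbers): crossline / inline error
-- of a raw shot with respect to its own preplot position:
--
-- SELECT public.st_x(e) AS error_i, public.st_y(e) AS error_j
-- FROM (
--   SELECT ij_error(rs.line, rs.point, rs.geometry) AS e
--   FROM raw_shots rs
--   WHERE rs.line = 5678 AND rs.point = 1001
-- ) sub;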
-- Return the list of points and metadata for all sequences.
-- Only points which have a corresponding preplot are returned.
-- If available, final positions are returned as well, if not they
-- are NULL.
-- Likewise, crossline / inline errors are also returned as a PostGIS
-- 2D point both for raw and final data.
CREATE OR REPLACE VIEW sequences_detail AS
SELECT
rl.sequence, rl.line AS sailline,
rs.line, rs.point,
rs.tstamp,
rs.objref objRefRaw, fs.objref objRefFinal,
ST_Transform(pp.geometry, 4326) geometryPreplot,
ST_Transform(rs.geometry, 4326) geometryRaw,
ST_Transform(fs.geometry, 4326) geometryFinal,
ij_error(rs.line, rs.point, rs.geometry) errorRaw,
ij_error(rs.line, rs.point, fs.geometry) errorFinal,
json_build_object('preplot', pp.meta, 'raw', rs.meta, 'final', fs.meta) meta
FROM
raw_lines rl
INNER JOIN raw_shots rs USING (sequence)
INNER JOIN preplot_points pp ON rs.line = pp.line AND rs.point = pp.point
LEFT JOIN final_shots fs ON rl.sequence = fs.sequence AND rs.point = fs.point;
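-- Example (hypothetical sequence number): inspect raw and final errors
-- for a single sequence:
--
-- SELECT sequence, line, point, errorraw, errorfinal
-- FROM sequences_detail
-- WHERE sequence = 123
-- ORDER BY point;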
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,75 @@
-- Upgrade the database from commit 81d9ea19 to 74b3de5c.
--
-- This upgrade affects the `public` schema only.
--
-- It creates a new table, `queue_items`, for storing
-- requests and responses related to inter-API communication.
-- At the moment this means Equinor's ASAQC API, but it
-- should be applicable to others as well if the need
-- arises.
--
-- As well as the table, it adds:
--
-- * `queue_item_status`, an ENUM type.
-- * `update_timestamp`, a trigger function.
-- * Two triggers on `queue_items`.
--
-- To apply, run as the dougal user:
--
-- psql < $THIS_FILE
--
-- NOTE: It will fail harmlessly if applied twice.
-- Queues are global, not per project,
-- so they go in the `public` schema.
CREATE TYPE queue_item_status
AS ENUM (
'queued',
'cancelled',
'failed',
'sent'
);
CREATE TABLE IF NOT EXISTS queue_items (
item_id serial NOT NULL PRIMARY KEY,
-- One day we may want multiple queues; in that case we will
-- have a queue_id and a relation of queue definitions.
-- But not right now.
-- queue_id integer NOT NULL REFERENCES queues (queue_id),
status queue_item_status NOT NULL DEFAULT 'queued',
payload jsonb NOT NULL,
results jsonb NOT NULL DEFAULT '{}'::jsonb,
created_on timestamptz NOT NULL DEFAULT current_timestamp,
updated_on timestamptz NOT NULL DEFAULT current_timestamp,
not_before timestamptz NOT NULL DEFAULT '1970-01-01T00:00:00Z',
parent_id integer NULL REFERENCES queue_items (item_id)
);
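-- Example (hypothetical payload shape): enqueue a request; `status`,
-- the timestamps and `results` all take their defaults:
--
-- INSERT INTO queue_items (payload)
-- VALUES ('{"target": "asaqc", "body": {}}'::jsonb);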
-- Sets `updated_on` to current_timestamp unless an explicit
-- timestamp is part of the update.
--
-- This function can be reused with any table that has (or could have)
-- an `updated_on` column of type timestamptz.
CREATE OR REPLACE FUNCTION update_timestamp () RETURNS trigger AS
$$
BEGIN
IF NEW.updated_on IS NOT NULL THEN
NEW.updated_on := current_timestamp;
END IF;
RETURN NEW;
EXCEPTION
WHEN undefined_column THEN RETURN NEW;
END;
$$
LANGUAGE plpgsql;
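-- Example (hypothetical table name): the same function can be attached
-- to any other table that has an `updated_on` timestamptz column:
--
-- CREATE TRIGGER my_table_tg0
-- BEFORE INSERT OR UPDATE ON public.my_table
-- FOR EACH ROW EXECUTE FUNCTION public.update_timestamp();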
CREATE TRIGGER queue_items_tg0
BEFORE INSERT OR UPDATE ON public.queue_items
FOR EACH ROW EXECUTE FUNCTION public.update_timestamp();
CREATE TRIGGER queue_items_tg1
AFTER INSERT OR DELETE OR UPDATE ON public.queue_items
FOR EACH ROW EXECUTE FUNCTION public.notify('queue_items');


@@ -0,0 +1,24 @@
-- Upgrade the database from commit 74b3de5c to commit 83be83e4.
--
-- NOTE: This upgrade only affects the `public` schema.
--
-- This inserts a database schema version into the database.
-- Note that we are not otherwise changing the schema, so older
-- server code will continue to run against this version.
--
-- ATTENTION!
--
-- This value should be incremented every time that the database
-- schema changes (either `public` or any of the survey schemas)
-- and is used by the server at start-up to detect if it is
-- running against a compatible schema version.
--
-- To apply, run as the dougal user:
--
-- psql < $THIS_FILE
--
-- NOTE: It can be applied multiple times without ill effect.
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.1.0"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.1.0"}' WHERE public.info.key = 'version';


@@ -0,0 +1,84 @@
-- Upgrade the database from commit 83be83e4 to 53ed096e.
--
-- New schema version: 0.2.0
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This migrates the file hashes to address issue #173.
-- The new hashes use size, modification time, creation time and the
-- first half of the MD5 hex digest of the file's absolute path.
--
-- It's a minor (rather than patch) version number increment because
-- changes to `bin/datastore.py` mean that the data is no longer
-- compatible with the hashing function.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE migrate_hashes (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Migrating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
EXECUTE format('UPDATE %I.files SET hash = array_to_string(array_append(trim_array(string_to_array(hash, '':''), 1), left(md5(path), 16)), '':'')', schema_name);
EXECUTE 'SET search_path TO public'; -- Back to the default search path for good measure
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE upgrade_10 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL migrate_hashes(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL upgrade_10();
CALL show_notice('Cleaning up');
DROP PROCEDURE migrate_hashes (schema_name text);
DROP PROCEDURE upgrade_10 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.2.0"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.2.0"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,189 @@
-- Add functions to retrieve sequence/shotpoint from timestamp and vice versa
--
-- New schema version: 0.2.1
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects the public schema and every survey schema.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- Two new functions are defined:
--
-- sequence_shot_from_tstamp(tstamp, [tolerance]) → sequence, point, delta
--
-- Returns a sequence + shotpoint if one falls within `tolerance` seconds
-- of `tstamp`. The tolerance may be omitted in which case it defaults to
-- three seconds. If multiple values match, it returns the closest in time.
--
-- tstamp_from_sequence_shot(sequence, point) → tstamp
--
-- Returns a timestamp given a sequence and point number.
--
-- NOTE: This last function must be called from a search path including a
-- project schema, as it accesses the raw_shots table.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect.
-- NOTE: This will lock the database while the transaction is active.
--
-- NOTE: This is a patch version change so it does not require a
-- backend restart.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION tstamp_from_sequence_shot(
IN s numeric,
IN p numeric,
OUT "ts" timestamptz)
AS $inner$
SELECT tstamp FROM raw_shots WHERE sequence = s AND point = p LIMIT 1;
$inner$ LANGUAGE SQL;
COMMENT ON FUNCTION tstamp_from_sequence_shot(numeric, numeric)
IS 'Get the timestamp of an existing shotpoint.';
CREATE OR REPLACE FUNCTION tstamp_interpolate(s numeric, p numeric) RETURNS timestamptz
AS $inner$
DECLARE
ts0 timestamptz;
ts1 timestamptz;
pt0 numeric;
pt1 numeric;
BEGIN
SELECT tstamp, point
INTO ts0, pt0
FROM raw_shots
WHERE sequence = s AND point < p
ORDER BY point DESC LIMIT 1;
SELECT tstamp, point
INTO ts1, pt1
FROM raw_shots
WHERE sequence = s AND point > p
ORDER BY point ASC LIMIT 1;
RETURN (ts1-ts0)/abs(pt1-pt0)*abs(p-pt0)+ts0;
END;
$inner$ LANGUAGE PLPGSQL;
COMMENT ON FUNCTION tstamp_interpolate(numeric, numeric)
IS 'Interpolate a timestamp given sequence and point values.
It will try to find the points immediately before and after in the sequence and interpolate into the gap, which may consist of multiple missed shots.
If called on an existing shotpoint it will return an interpolated timestamp as if the shotpoint did not exist, as opposed to returning its actual timestamp.
Returns NULL if it is not possible to interpolate.';
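-- Example (hypothetical values): resolve the timestamp of shot 1005 in
-- sequence 42, falling back to interpolation if the shot was missed:
--
-- SELECT COALESCE(
--   tstamp_from_sequence_shot(42, 1005),
--   tstamp_interpolate(42, 1005)
-- );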
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_database () AS $$
DECLARE
row RECORD;
BEGIN
CREATE OR REPLACE FUNCTION public.sequence_shot_from_tstamp(
IN ts timestamptz,
IN tolerance numeric,
OUT "sequence" numeric,
OUT "point" numeric,
OUT "delta" numeric)
AS $inner$
SELECT
(meta->>'_sequence')::numeric AS sequence,
(meta->>'_point')::numeric AS point,
extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ) AS delta
FROM real_time_inputs
WHERE
meta ? '_sequence' AND
abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts )) < tolerance
ORDER BY abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ))
LIMIT 1;
$inner$ LANGUAGE SQL;
COMMENT ON FUNCTION public.sequence_shot_from_tstamp(timestamptz, numeric)
IS 'Get sequence and shotpoint from timestamp.
Given a timestamp this function returns the closest shot to it within the given tolerance value.
This uses the `real_time_inputs` table and it does not give an indication of which project the shotpoint belongs to. It is assumed that a single project is being acquired at a given time.';
CREATE OR REPLACE FUNCTION public.sequence_shot_from_tstamp(
IN ts timestamptz,
OUT "sequence" numeric,
OUT "point" numeric,
OUT "delta" numeric)
AS $inner$
SELECT * FROM public.sequence_shot_from_tstamp(ts, 3);
$inner$ LANGUAGE SQL;
COMMENT ON FUNCTION public.sequence_shot_from_tstamp(timestamptz)
IS 'Get sequence and shotpoint from timestamp.
Overloaded form in which the tolerance value is implied and defaults to three seconds.';
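-- Example (hypothetical timestamp): which shot fired closest to this
-- instant, within the default three-second tolerance?
--
-- SELECT * FROM public.sequence_shot_from_tstamp('2023-01-01T00:00:00Z');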
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_database();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_database ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.2.1"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.2.1"}' WHERE public.info.key = 'version';
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,360 @@
-- Add new event log schema.
--
-- New schema version: 0.2.2
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
-- REQUIRES POSTGRESQL VERSION 14 OR NEWER
-- (Because of CREATE OR REPLACE TRIGGER)
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This is a redesign of the event logging mechanism. The old mechanism
-- relied on a distinction between sequence events (i.e., those which can
-- be associated to a shotpoint within a sequence), timed events (those
-- which occur outside any acquisition sequence) and so-called virtual
-- events (deduced from the data). It was inflexible and inefficient,
-- as most of the time we needed to merge these event types
-- a single view.
--
-- The new mechanism:
-- - uses a single table
-- - accepts sequence event entries for shots or sequences which may not (yet)
-- exist. (https://gitlab.com/wgp/dougal/software/-/issues/170)
-- - keeps edit history (https://gitlab.com/wgp/dougal/software/-/issues/138)
-- - keeps track of when an entry was made or subsequently edited.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect, as long
-- as the new tables did not previously exist. If they did, they will
-- be emptied before migrating the data.
--
-- WARNING: Applying this upgrade migrates the old event data. It does
-- NOT yet drop the old tables, which is handled in a separate script,
-- leaving the actions here technically reversible without having to
-- restore from backup.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE SEQUENCE IF NOT EXISTS event_log_uid_seq
AS integer
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
CREATE TABLE IF NOT EXISTS event_log_full (
-- uid is a unique id for each entry in the table,
-- including revisions of an existing entry.
uid integer NOT NULL PRIMARY KEY DEFAULT nextval('event_log_uid_seq'),
-- All revisions of an entry share the same id.
-- If inserting a new entry, id = uid.
id integer NOT NULL,
-- No default tstamp because, for instance, a user could
-- enter a sequence/point event referring to the future.
-- An external process should scan those at regular intervals
-- and populate the tstamp as needed.
tstamp timestamptz NULL,
sequence integer NULL,
point integer NULL,
remarks text NOT NULL DEFAULT '',
labels text[] NOT NULL DEFAULT ARRAY[]::text[],
-- TODO: Need a geometry column? Let us check performance as it is
-- and if needed add a geometry column + spatial index.
meta jsonb NOT NULL DEFAULT '{}'::jsonb,
validity tstzrange NOT NULL CHECK (NOT isempty(validity)),
-- We accept either:
-- - Just a tstamp
-- - Just a sequence / point pair
-- - All three
-- We don't accept:
-- - A sequence without a point or vice-versa
-- - Nothing being provided
CHECK (
(tstamp IS NOT NULL AND sequence IS NOT NULL AND point IS NOT NULL) OR
(tstamp IS NOT NULL AND sequence IS NULL AND point IS NULL) OR
(tstamp IS NULL AND sequence IS NOT NULL AND point IS NOT NULL)
)
);
CREATE INDEX IF NOT EXISTS event_log_id ON event_log_full USING btree (id);
CREATE OR REPLACE FUNCTION event_log_full_insert() RETURNS TRIGGER AS $inner$
BEGIN
NEW.id := COALESCE(NEW.id, NEW.uid);
NEW.validity := tstzrange(current_timestamp, NULL);
NEW.meta = COALESCE(NEW.meta, '{}'::jsonb);
NEW.labels = COALESCE(NEW.labels, ARRAY[]::text[]);
IF cardinality(NEW.labels) > 0 THEN
-- Remove duplicates
SELECT array_agg(DISTINCT elements)
INTO NEW.labels
FROM (SELECT unnest(NEW.labels) AS elements) AS labels;
END IF;
RETURN NEW;
END;
$inner$ LANGUAGE plpgsql;
CREATE OR REPLACE TRIGGER event_log_full_insert_tg
BEFORE INSERT ON event_log_full
FOR EACH ROW EXECUTE FUNCTION event_log_full_insert();
-- The public.notify() trigger to alert clients that something has changed
CREATE OR REPLACE TRIGGER event_log_full_notify_tg
AFTER INSERT OR DELETE OR UPDATE
ON event_log_full FOR EACH ROW EXECUTE FUNCTION public.notify('event');
--
-- VIEW event_log
--
-- This is what is exposed to the user most of the time.
-- It shows the current version of records in the event_log_full
-- table.
--
-- The user applies edits to this table directly, which are
-- processed via triggers.
--
CREATE OR REPLACE VIEW event_log AS
SELECT
id, tstamp, sequence, point, remarks, labels, meta,
uid <> id AS has_edits,
lower(validity) AS modified_on
FROM event_log_full
WHERE validity @> current_timestamp;
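-- Example (hypothetical values): inserts go through the view; the
-- INSTEAD OF trigger below fills in tstamp (if resolvable), id and
-- validity:
--
-- INSERT INTO event_log (sequence, point, remarks, labels)
-- VALUES (123, 1001, 'FSP example', ARRAY['FSP']);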
CREATE OR REPLACE FUNCTION event_log_update() RETURNS TRIGGER AS $inner$
BEGIN
IF (TG_OP = 'INSERT') THEN
-- Complete the tstamp if possible
IF NEW.sequence IS NOT NULL AND NEW.point IS NOT NULL AND NEW.tstamp IS NULL THEN
SELECT COALESCE(
tstamp_from_sequence_shot(NEW.sequence, NEW.point),
tstamp_interpolate(NEW.sequence, NEW.point)
)
INTO NEW.tstamp;
END IF;
-- Any id that is provided will be ignored. The generated
-- id will match uid.
INSERT INTO event_log_full
(tstamp, sequence, point, remarks, labels, meta)
VALUES (NEW.tstamp, NEW.sequence, NEW.point, NEW.remarks, NEW.labels, NEW.meta);
RETURN NEW;
ELSIF (TG_OP = 'UPDATE') THEN
-- Set end of validity and create a new entry with id
-- matching that of the old entry.
-- NOTE: Do not allow updating an event that has meta.readonly = true
IF EXISTS
(SELECT *
FROM event_log_full
WHERE id = OLD.id AND (meta->>'readonly')::boolean IS TRUE)
THEN
RAISE check_violation USING MESSAGE = 'Cannot modify read-only entry';
RETURN NULL;
END IF;
-- If the sequence / point has changed, and no new tstamp is provided, get one
IF NEW.sequence <> OLD.sequence OR NEW.point <> OLD.point
AND NEW.sequence IS NOT NULL AND NEW.point IS NOT NULL
AND NEW.tstamp IS NULL OR NEW.tstamp = OLD.tstamp THEN
SELECT COALESCE(
tstamp_from_sequence_shot(NEW.sequence, NEW.point),
tstamp_interpolate(NEW.sequence, NEW.point)
)
INTO NEW.tstamp;
END IF;
UPDATE event_log_full
SET validity = tstzrange(lower(validity), current_timestamp)
WHERE validity @> current_timestamp AND id = OLD.id;
-- Any attempt to modify id will be ignored.
INSERT INTO event_log_full
(id, tstamp, sequence, point, remarks, labels, meta)
VALUES (OLD.id, NEW.tstamp, NEW.sequence, NEW.point, NEW.remarks, NEW.labels, NEW.meta);
RETURN NEW;
ELSIF (TG_OP = 'DELETE') THEN
-- Set end of validity.
-- NOTE: We *do* allow deleting an event that has meta.readonly = true
-- This could be of interest if for instance we wanted to keep the history
-- of QC results for a point, provided that the QC routines write to
-- event_log and not event_log_full
UPDATE event_log_full
SET validity = tstzrange(lower(validity), current_timestamp)
WHERE validity @> current_timestamp AND id = OLD.id;
RETURN NULL;
END IF;
END;
$inner$ LANGUAGE plpgsql;
CREATE OR REPLACE TRIGGER event_log_tg
INSTEAD OF INSERT OR UPDATE OR DELETE ON event_log
FOR EACH ROW EXECUTE FUNCTION event_log_update();
-- NOTE
-- This is where we migrate the actual data
RAISE NOTICE 'Migrating schema %', schema_name;
-- We start by deleting any data that the new tables might
-- have had if they already existed.
DELETE FROM event_log_full;
-- We purposefully bypass event_log here, as the tables we're
-- migrating from only contain a single version of each event.
INSERT INTO event_log_full (tstamp, sequence, point, remarks, labels, meta)
SELECT
tstamp, sequence, point, remarks, labels,
meta || json_build_object('geometry', geometry, 'readonly', virtual)::jsonb
FROM events;
UPDATE event_log_full SET meta = meta - 'geometry' WHERE meta->>'geometry' IS NULL;
UPDATE event_log_full SET meta = meta - 'readonly' WHERE (meta->'readonly')::boolean IS false;
-- This function used the superseded `events` view.
-- We need to drop it because we're changing the return type.
DROP FUNCTION IF EXISTS label_in_sequence (_sequence integer, _label text);
CREATE OR REPLACE FUNCTION label_in_sequence (_sequence integer, _label text)
RETURNS event_log
LANGUAGE sql
AS $inner$
SELECT * FROM event_log WHERE sequence = _sequence AND _label = ANY(labels);
$inner$;
-- This function used the superseded `events` view (and some strange logic).
CREATE OR REPLACE PROCEDURE handle_final_line_events (_seq integer, _label text, _column text)
LANGUAGE plpgsql
AS $inner$
DECLARE
_line final_lines_summary%ROWTYPE;
_column_value integer;
_tg_name text := 'final_line';
_event event_log%ROWTYPE;
event_id integer;
BEGIN
SELECT * INTO _line FROM final_lines_summary WHERE sequence = _seq;
_event := label_in_sequence(_seq, _label);
_column_value := row_to_json(_line)->>_column;
--RAISE NOTICE '% is %', _label, _event;
--RAISE NOTICE 'Line is %', _line;
--RAISE NOTICE '% is % (%)', _column, _column_value, _label;
IF _event IS NULL THEN
--RAISE NOTICE 'We will populate the event log from the sequence data';
INSERT INTO event_log (sequence, point, remarks, labels, meta)
VALUES (
-- The sequence
_seq,
-- The shotpoint
_column_value,
-- Remark. Something like "FSP <linename>"
format('%s %s', _label, (SELECT meta->>'lineName' FROM final_lines WHERE sequence = _seq)),
-- Label
ARRAY[_label],
-- Meta. Something like {"auto" : {"FSP" : "final_line"}}
json_build_object('auto', json_build_object(_label, _tg_name))
);
ELSE
--RAISE NOTICE 'We may populate the sequence meta from the event log';
--RAISE NOTICE 'Unless the event log was populated by us previously';
--RAISE NOTICE 'Populated by us previously? %', _event.meta->'auto'->>_label = _tg_name;
IF _event.meta->'auto'->>_label IS DISTINCT FROM _tg_name THEN
--RAISE NOTICE 'Adding % found in events log to final_line meta', _label;
UPDATE final_lines
SET meta = jsonb_set(meta, ARRAY[_label], to_jsonb(_event.point))
WHERE sequence = _seq;
END IF;
END IF;
END;
$inner$;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_12 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_12();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_12 ();
CALL show_notice('Updating db_schema version');
-- This is technically still compatible with 0.2.0 as we are only adding
-- some more tables and views but not yet dropping the old ones, which we
-- will do separately so that these scripts do not get too big.
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.2.2"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.2.2"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,98 @@
-- Migrate events to new schema
--
-- New schema version: 0.3.0
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This migrates the data from the old event log tables to the new schema.
-- It is a *very* good idea to review the data manually after the migration
-- as issues with the logs that had gone unnoticed may become evident now.
--
-- WARNING: If data exists in the new event tables, IT WILL BE TRUNCATED.
--
-- Other than that, this migration is fairly benign as it does not modify
-- the old data.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect.
-- NOTE: This will lock the new event tables while the transaction is active.
--
-- WARNING: This is a minor (not patch) version change, meaning that it requires
-- an upgrade and restart of the backend server.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
TRUNCATE event_log_full;
-- NOTE: meta->>'readonly' = TRUE means that the event was created algorithmically
-- and should not be user editable.
INSERT INTO event_log_full (tstamp, sequence, point, remarks, labels, meta)
SELECT
tstamp, sequence, point, remarks, labels,
meta || json_build_object('geometry', geometry, 'readonly', virtual)::jsonb
FROM events;
-- We purposefully bypass event_log here
UPDATE event_log_full SET meta = meta - 'geometry' WHERE meta->>'geometry' IS NULL;
UPDATE event_log_full SET meta = meta - 'readonly' WHERE (meta->'readonly')::boolean IS false;
END
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_database () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_database();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_database ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.0"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.0"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,99 @@
-- Drop old event tables.
--
-- New schema version: 0.3.1
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This completes the migration from the old event logging mechanism by
-- DROPPING THE OLD DATABASE OBJECTS, MAKING THE MIGRATION IRREVERSIBLE,
-- other than by restoring from backup and manually transferring any new
-- data that may have been created in the meanwhile.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect.
-- NOTE: This will lock the database while the transaction is active.
--
-- WARNING: Applying this upgrade drops the old tables. Ensure that you
-- have migrated the data first.
--
-- NOTE: This is a patch version change so it does not require a
-- backend restart.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
DROP FUNCTION IF EXISTS
label_in_sequence(integer,text), reset_events_serials();
DROP VIEW IF EXISTS
events_midnight_shot, events_seq_timed, events_labels, "events";
DROP TABLE IF EXISTS
events_seq_labels, events_timed_labels, events_timed_seq, events_seq, events_timed;
DROP SEQUENCE IF EXISTS
events_seq_id_seq, events_timed_id_seq;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_database () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_database();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_database ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.1"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.1"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,136 @@
-- Fix project_summary view.
--
-- New schema version: 0.3.2
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This fixes a problem with the project_summary view. In its common table
-- expression, the view definition tried to search public.projects based on
-- the search path value with the following expression:
--
-- (current_setting('search_path'::text) ~~ (p.schema || '%'::text))
--
-- That is of course bound to fail as soon as the schema goes above `survey_9`
-- because `survey_10 LIKE ('survey_1' || '%')` is TRUE.
--
-- The new mechanism relies on splitting the search_path.
--
-- NOTE: The survey schema needs to be the leftmost element in search_path.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE VIEW project_summary AS
WITH fls AS (
SELECT avg((final_lines_summary.duration / ((final_lines_summary.num_points - 1))::double precision)) AS shooting_rate,
avg((final_lines_summary.length / date_part('epoch'::text, final_lines_summary.duration))) AS speed,
sum(final_lines_summary.duration) AS prod_duration,
sum(final_lines_summary.length) AS prod_distance
FROM final_lines_summary
), project AS (
SELECT p.pid,
p.name,
p.schema
FROM public.projects p
WHERE (split_part(current_setting('search_path'::text), ','::text, 1) = p.schema)
)
SELECT project.pid,
project.name,
project.schema,
( SELECT count(*) AS count
FROM preplot_lines
WHERE (preplot_lines.class = 'V'::bpchar)) AS lines,
ps.total,
ps.virgin,
ps.prime,
ps.other,
ps.ntba,
ps.remaining,
( SELECT to_json(fs.*) AS to_json
FROM final_shots fs
ORDER BY fs.tstamp
LIMIT 1) AS fsp,
( SELECT to_json(fs.*) AS to_json
FROM final_shots fs
ORDER BY fs.tstamp DESC
LIMIT 1) AS lsp,
( SELECT count(*) AS count
FROM raw_lines rl) AS seq_raw,
( SELECT count(*) AS count
FROM final_lines rl) AS seq_final,
fls.prod_duration,
fls.prod_distance,
fls.speed AS shooting_rate
FROM preplot_summary ps,
fls,
project;
ALTER TABLE project_summary OWNER TO postgres;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_15 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_15();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_15 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.2"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.2"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,169 @@
-- Fix not being able to edit a time-based event.
--
-- New schema version: 0.3.3
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- The event_log_update() function that gets called when trying to update
-- the event_log view will not work if the caller does not provide a new
-- timestamp or sequence + point in the list of fields to be updated. See:
-- https://gitlab.com/wgp/dougal/software/-/issues/198
--
-- This fixes the problem by liberally using COALESCE() to merge the OLD
-- and NEW records.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION event_log_update() RETURNS trigger
LANGUAGE plpgsql
AS $inner$
BEGIN
IF (TG_OP = 'INSERT') THEN
-- Complete the tstamp if possible
IF NEW.sequence IS NOT NULL AND NEW.point IS NOT NULL AND NEW.tstamp IS NULL THEN
SELECT COALESCE(
tstamp_from_sequence_shot(NEW.sequence, NEW.point),
tstamp_interpolate(NEW.sequence, NEW.point)
)
INTO NEW.tstamp;
END IF;
-- Any id that is provided will be ignored. The generated
-- id will match uid.
INSERT INTO event_log_full
(tstamp, sequence, point, remarks, labels, meta)
VALUES (NEW.tstamp, NEW.sequence, NEW.point, NEW.remarks, NEW.labels, NEW.meta);
RETURN NEW;
ELSIF (TG_OP = 'UPDATE') THEN
-- Set end of validity and create a new entry with id
-- matching that of the old entry.
-- NOTE: Do not allow updating an event that has meta.readonly = true
IF EXISTS
(SELECT *
FROM event_log_full
WHERE id = OLD.id AND (meta->>'readonly')::boolean IS TRUE)
THEN
RAISE check_violation USING MESSAGE = 'Cannot modify read-only entry';
RETURN NULL;
END IF;
-- If the sequence / point has changed, and no new tstamp is provided, get one
IF (NEW.sequence <> OLD.sequence OR NEW.point <> OLD.point)
AND NEW.sequence IS NOT NULL AND NEW.point IS NOT NULL
AND (NEW.tstamp IS NULL OR NEW.tstamp = OLD.tstamp) THEN
SELECT COALESCE(
tstamp_from_sequence_shot(NEW.sequence, NEW.point),
tstamp_interpolate(NEW.sequence, NEW.point)
)
INTO NEW.tstamp;
END IF;
UPDATE event_log_full
SET validity = tstzrange(lower(validity), current_timestamp)
WHERE validity @> current_timestamp AND id = OLD.id;
-- Any attempt to modify id will be ignored.
INSERT INTO event_log_full
(id, tstamp, sequence, point, remarks, labels, meta)
VALUES (
OLD.id,
COALESCE(NEW.tstamp, OLD.tstamp),
COALESCE(NEW.sequence, OLD.sequence),
COALESCE(NEW.point, OLD.point),
COALESCE(NEW.remarks, OLD.remarks),
COALESCE(NEW.labels, OLD.labels),
COALESCE(NEW.meta, OLD.meta)
);
RETURN NEW;
ELSIF (TG_OP = 'DELETE') THEN
-- Set end of validity.
-- NOTE: We *do* allow deleting an event that has meta.readonly = true
-- This could be of interest if for instance we wanted to keep the history
-- of QC results for a point, provided that the QC routines write to
-- event_log and not event_log_full
UPDATE event_log_full
SET validity = tstzrange(lower(validity), current_timestamp)
WHERE validity @> current_timestamp AND id = OLD.id;
RETURN NULL;
END IF;
END;
$inner$;
CREATE OR REPLACE TRIGGER event_log_tg INSTEAD OF INSERT OR DELETE OR UPDATE ON event_log FOR EACH ROW EXECUTE FUNCTION event_log_update();
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_16 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_16();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_16 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.3"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.3"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,163 @@
-- Add augment_event_data() to populate missing event timestamps and geometries.
--
-- New schema version: 0.3.4
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This creates a new procedure augment_event_data() which tries to
-- populate missing event_log data, namely timestamps and geometries.
--
-- To do this it also adds a function public.geometry_from_tstamp()
-- which, given a timestamp, tries to fetch a geometry from real_time_inputs.
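--
-- Illustrative usage once applied (timestamp and tolerance are examples):
--
--   CALL augment_event_data();
--   SELECT * FROM public.geometry_from_tstamp('2023-10-01 12:00+00', 3);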
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE PROCEDURE augment_event_data ()
LANGUAGE sql
AS $inner$
-- Populate the timestamp of sequence / point events
UPDATE event_log_full
SET tstamp = tstamp_from_sequence_shot(sequence, point)
WHERE
tstamp IS NULL AND sequence IS NOT NULL AND point IS NOT NULL;
-- Populate the geometry of sequence / point events for which
-- there is raw_shots data.
UPDATE event_log_full
SET meta = meta ||
jsonb_build_object(
'geometry',
(
SELECT st_transform(geometry, 4326)::jsonb
FROM raw_shots rs
WHERE rs.sequence = event_log_full.sequence AND rs.point = event_log_full.point
)
)
WHERE
sequence IS NOT NULL AND point IS NOT NULL AND
NOT meta ? 'geometry';
-- Populate the geometry of time-based events
UPDATE event_log_full e
SET
meta = meta || jsonb_build_object('geometry',
(SELECT st_transform(g.geometry, 4326)::jsonb
FROM geometry_from_tstamp(e.tstamp, 3) g))
WHERE
tstamp IS NOT NULL AND
sequence IS NULL AND point IS NULL AND
NOT meta ? 'geometry';
-- Get rid of null geometries
UPDATE event_log_full
SET
meta = meta - 'geometry'
WHERE
jsonb_typeof(meta->'geometry') = 'null';
-- Simplify the GeoJSON when the CRS is EPSG:4326
UPDATE event_log_full
SET
meta = meta #- '{geometry, crs}'
WHERE
meta->'geometry'->'crs'->'properties'->>'name' = 'EPSG:4326';
$inner$;
COMMENT ON PROCEDURE augment_event_data()
IS 'Populate missing timestamps and geometries in event_log_full';
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_17 () AS $$
DECLARE
row RECORD;
BEGIN
CALL show_notice('Adding index to real_time_inputs.meta->tstamp');
CREATE INDEX IF NOT EXISTS meta_tstamp_idx
ON public.real_time_inputs
USING btree ((meta->>'tstamp') DESC);
CALL show_notice('Creating function geometry_from_tstamp');
CREATE OR REPLACE FUNCTION public.geometry_from_tstamp(
IN ts timestamptz,
IN tolerance numeric,
OUT "geometry" geometry,
OUT "delta" numeric)
AS $inner$
SELECT
geometry,
extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ) AS delta
FROM real_time_inputs
WHERE
geometry IS NOT NULL AND
abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts )) < tolerance
ORDER BY abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ))
LIMIT 1;
$inner$ LANGUAGE SQL;
COMMENT ON FUNCTION public.geometry_from_tstamp(timestamptz, numeric)
IS 'Get geometry from timestamp';
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_17();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_17 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.4"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.4"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,158 @@
-- Add the missing label_in_sequence() function to production schemas.
--
-- New schema version: 0.3.5
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- The function label_in_sequence(integer, text) was missing for the
-- production schemas. This patch (re-)defines the function as well
-- as the other functions that depend on it (otherwise the change does
-- not get picked up).
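--
-- Illustrative usage (the sequence number is an example):
--
--   SELECT * FROM label_in_sequence(12, 'FSP');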
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION label_in_sequence(_sequence integer, _label text) RETURNS event_log
LANGUAGE sql
AS $inner$
SELECT * FROM event_log WHERE sequence = _sequence AND _label = ANY(labels);
$inner$;
-- We need to redefine the functions / procedures that call label_in_sequence
CREATE OR REPLACE PROCEDURE handle_final_line_events(IN _seq integer, IN _label text, IN _column text)
LANGUAGE plpgsql
AS $inner$
DECLARE
_line final_lines_summary%ROWTYPE;
_column_value integer;
_tg_name text := 'final_line';
_event event_log%ROWTYPE;
event_id integer;
BEGIN
SELECT * INTO _line FROM final_lines_summary WHERE sequence = _seq;
_event := label_in_sequence(_seq, _label);
_column_value := row_to_json(_line)->>_column;
--RAISE NOTICE '% is %', _label, _event;
--RAISE NOTICE 'Line is %', _line;
--RAISE NOTICE '% is % (%)', _column, _column_value, _label;
IF _event IS NULL THEN
--RAISE NOTICE 'We will populate the event log from the sequence data';
INSERT INTO event_log (sequence, point, remarks, labels, meta)
VALUES (
-- The sequence
_seq,
-- The shotpoint
_column_value,
-- Remark. Something like "FSP <linename>"
format('%s %s', _label, (SELECT meta->>'lineName' FROM final_lines WHERE sequence = _seq)),
-- Label
ARRAY[_label],
-- Meta. Something like {"auto" : {"FSP" : "final_line"}}
json_build_object('auto', json_build_object(_label, _tg_name))
);
ELSE
--RAISE NOTICE 'We may populate the sequence meta from the event log';
--RAISE NOTICE 'Unless the event log was populated by us previously';
--RAISE NOTICE 'Populated by us previously? %', _event.meta->'auto'->>_label = _tg_name;
IF _event.meta->'auto'->>_label IS DISTINCT FROM _tg_name THEN
--RAISE NOTICE 'Adding % found in events log to final_line meta', _label;
UPDATE final_lines
SET meta = jsonb_set(meta, ARRAY[_label], to_jsonb(_event.point))
WHERE sequence = _seq;
END IF;
END IF;
END;
$inner$;
CREATE OR REPLACE PROCEDURE final_line_post_import(IN _seq integer)
LANGUAGE plpgsql
AS $inner$
BEGIN
CALL handle_final_line_events(_seq, 'FSP', 'fsp');
CALL handle_final_line_events(_seq, 'FGSP', 'fsp');
CALL handle_final_line_events(_seq, 'LGSP', 'lsp');
CALL handle_final_line_events(_seq, 'LSP', 'lsp');
END;
$inner$;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_18 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_18();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_18 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.5"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.5"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,162 @@
-- Optimise geometry_from_tstamp() (issue #241).
--
-- New schema version: 0.3.6
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This optimises geometry_from_tstamp() by many orders of magnitude
-- (issue #241). The redefinition of geometry_from_tstamp() necessitates
-- redefining dependent functions.
--
-- We also drop the index on real_time_inputs.meta->'tstamp' as it is no
-- longer used.
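--
-- The speed-up comes from filtering on the tstamp column with a sargable
-- BETWEEN range instead of evaluating (meta->>'tstamp')::timestamptz for
-- every row, e.g. (illustrative call):
--
--   SELECT * FROM public.geometry_from_tstamp(now() - interval '1 hour', 3);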
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE PROCEDURE augment_event_data ()
LANGUAGE sql
AS $inner$
-- Populate the timestamp of sequence / point events
UPDATE event_log_full
SET tstamp = tstamp_from_sequence_shot(sequence, point)
WHERE
tstamp IS NULL AND sequence IS NOT NULL AND point IS NOT NULL;
-- Populate the geometry of sequence / point events for which
-- there is raw_shots data.
UPDATE event_log_full
SET meta = meta ||
jsonb_build_object(
'geometry',
(
SELECT st_transform(geometry, 4326)::jsonb
FROM raw_shots rs
WHERE rs.sequence = event_log_full.sequence AND rs.point = event_log_full.point
)
)
WHERE
sequence IS NOT NULL AND point IS NOT NULL AND
NOT meta ? 'geometry';
-- Populate the geometry of time-based events
UPDATE event_log_full e
SET
meta = meta || jsonb_build_object('geometry',
(SELECT st_transform(g.geometry, 4326)::jsonb
FROM geometry_from_tstamp(e.tstamp, 3) g))
WHERE
tstamp IS NOT NULL AND
sequence IS NULL AND point IS NULL AND
NOT meta ? 'geometry';
-- Get rid of null geometries
UPDATE event_log_full
SET
meta = meta - 'geometry'
WHERE
jsonb_typeof(meta->'geometry') = 'null';
-- Simplify the GeoJSON when the CRS is EPSG:4326
UPDATE event_log_full
SET
meta = meta #- '{geometry, crs}'
WHERE
meta->'geometry'->'crs'->'properties'->>'name' = 'EPSG:4326';
$inner$;
COMMENT ON PROCEDURE augment_event_data()
IS 'Populate missing timestamps and geometries in event_log_full';
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
BEGIN
CALL show_notice('Dropping index from real_time_inputs.meta->tstamp');
DROP INDEX IF EXISTS meta_tstamp_idx;
CALL show_notice('Creating function geometry_from_tstamp');
CREATE OR REPLACE FUNCTION public.geometry_from_tstamp(
IN ts timestamptz,
IN tolerance numeric,
OUT "geometry" geometry,
OUT "delta" numeric)
AS $inner$
SELECT
geometry,
extract('epoch' FROM tstamp - ts ) AS delta
FROM real_time_inputs
WHERE
geometry IS NOT NULL AND
tstamp BETWEEN (ts - tolerance * interval '1 second') AND (ts + tolerance * interval '1 second')
ORDER BY abs(extract('epoch' FROM tstamp - ts ))
LIMIT 1;
$inner$ LANGUAGE SQL;
COMMENT ON FUNCTION public.geometry_from_tstamp(timestamptz, numeric)
IS 'Get geometry from timestamp';
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.6"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.6"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,254 @@
-- Update adjust_planner() for the new events schema.
--
-- New schema version: 0.3.7
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This updates the adjust_planner() procedure to take into account the
-- new events schema (the `event` view has been replaced by `event_log`).
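--
-- adjust_planner() takes no arguments; once the patch is applied it is
-- invoked simply as:
--
--   CALL adjust_planner();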
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CALL pg_temp.show_notice('Replacing adjust_planner() procedure');
CREATE OR REPLACE PROCEDURE adjust_planner()
LANGUAGE plpgsql
AS $$
DECLARE
_planner_config jsonb;
_planned_line planned_lines%ROWTYPE;
_lag interval;
_last_sequence sequences_summary%ROWTYPE;
_deltatime interval;
_shotinterval interval;
_tstamp timestamptz;
_incr integer;
BEGIN
SET CONSTRAINTS planned_lines_pkey DEFERRED;
SELECT data->'planner'
INTO _planner_config
FROM file_data
WHERE data ? 'planner';
SELECT *
INTO _last_sequence
FROM sequences_summary
ORDER BY sequence DESC
LIMIT 1;
SELECT *
INTO _planned_line
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
SELECT
COALESCE(
((lead(ts0) OVER (ORDER BY sequence)) - ts1),
make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer)
)
INTO _lag
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
_incr = sign(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE '_planner_config: %', _planner_config;
RAISE NOTICE '_last_sequence: %', _last_sequence;
RAISE NOTICE '_planned_line: %', _planned_line;
RAISE NOTICE '_incr: %', _incr;
-- Does the latest sequence match a planned sequence?
IF _planned_line IS NULL THEN -- No it doesn't
RAISE NOTICE 'Latest sequence shot does not match a planned sequence';
SELECT * INTO _planned_line FROM planned_lines ORDER BY sequence ASC LIMIT 1;
RAISE NOTICE '_planned_line: %', _planned_line;
IF _planned_line.sequence <= _last_sequence.sequence THEN
RAISE NOTICE 'Renumbering the planned sequences starting from %', _planned_line.sequence + 1;
-- Renumber the planned sequences starting from last shot sequence number + 1
UPDATE planned_lines
SET sequence = sequence + _last_sequence.sequence - _planned_line.sequence + 1;
END IF;
-- The correction to make to the first planned line's ts0 will be based on either the last
-- sequence's EOL + default line change time or the current time, whichever is later.
_deltatime := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1) + make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer), current_timestamp) - _planned_line.ts0;
-- Is the first of the planned lines start time in the past? (±5 mins)
IF _planned_line.ts0 < (current_timestamp - make_interval(mins => 5)) THEN
RAISE NOTICE 'First planned line is in the past. Adjusting times by %', _deltatime;
-- Adjust the start / end time of the planned lines by assuming that we are at
-- `defaultLineChangeDuration` minutes away from SOL of the first planned line.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime;
END IF;
ELSE -- Yes it does
RAISE NOTICE 'Latest sequence does match a planned sequence: %, %', _planned_line.sequence, _planned_line.line;
-- Is it online?
IF EXISTS(SELECT 1 FROM raw_lines_files WHERE sequence = _last_sequence.sequence AND hash = '*online*') THEN
-- Yes it is
RAISE NOTICE 'Sequence % is online', _last_sequence.sequence;
-- Let us get the SOL from the events log if we can
RAISE NOTICE 'Trying to set fsp, ts0 from events log FSP, FGSP';
WITH e AS (
SELECT * FROM event_log
WHERE
sequence = _last_sequence.sequence
AND ('FSP' = ANY(labels) OR 'FGSP' = ANY(labels))
ORDER BY tstamp LIMIT 1
)
UPDATE planned_lines
SET
fsp = COALESCE(e.point, fsp),
ts0 = COALESCE(e.tstamp, ts0)
FROM e
WHERE planned_lines.sequence = _last_sequence.sequence;
-- Shot interval
_shotinterval := (_last_sequence.ts1 - _last_sequence.ts0) / abs(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE 'Estimating EOL from current shot interval: %', _shotinterval;
SELECT (abs(lsp-fsp) * _shotinterval + ts0) - ts1
INTO _deltatime
FROM planned_lines
WHERE sequence = _last_sequence.sequence;
---- Set ts1 for the current sequence
--UPDATE planned_lines
--SET
--ts1 = (abs(lsp-fsp) * _shotinterval) + ts0
--WHERE sequence = _last_sequence.sequence;
RAISE NOTICE 'Adjustment is %', _deltatime;
IF abs(EXTRACT(EPOCH FROM _deltatime)) < 8 THEN
RAISE NOTICE 'Adjustment too small (< 8 s), so not applying it';
RETURN;
END IF;
-- Adjust ts1 for the current sequence
UPDATE planned_lines
SET ts1 = ts1 + _deltatime
WHERE sequence = _last_sequence.sequence;
-- Now shift all sequences after
UPDATE planned_lines
SET ts0 = ts0 + _deltatime, ts1 = ts1 + _deltatime
WHERE sequence > _last_sequence.sequence;
RAISE NOTICE 'Deleting planned sequences before %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence < _last_sequence.sequence;
ELSE
-- No it isn't
RAISE NOTICE 'Sequence % is offline', _last_sequence.sequence;
-- We were supposed to finish at _planned_line.ts1 but we finished at:
_tstamp := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1), current_timestamp);
-- WARNING Next line is for testing only
--_tstamp := COALESCE(_last_sequence.ts1_final, _last_sequence.ts1);
-- So we need to adjust timestamps by:
_deltatime := _tstamp - _planned_line.ts1;
RAISE NOTICE 'Planned end: %, actual end: % (%, %)', _planned_line.ts1, _tstamp, _planned_line.sequence, _last_sequence.sequence;
RAISE NOTICE 'Shifting times by % for sequences > %', _deltatime, _planned_line.sequence;
-- NOTE: This won't work if sequences are not, err… sequential.
-- NOTE: This has been known to happen in 2020.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime
WHERE sequence > _planned_line.sequence;
RAISE NOTICE 'Deleting planned sequences up to %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence <= _last_sequence.sequence;
END IF;
END IF;
END;
$$;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.7"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.7"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,267 @@
-- Add event_position() and event_meta() lookup functions.
--
-- New schema version: 0.3.8
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This adds event_position() and event_meta() functions which are used
-- to retrieve position or metadata, respectively, given either a timestamp
-- or a sequence / point pair. Intended to be used in the context of #229.
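--
-- Illustrative calls covering the overloads defined below (values are
-- examples):
--
--   SELECT event_position('2023-10-01 12:00+00'::timestamptz);
--   SELECT event_position(11, 2600);
--   SELECT event_meta(11, 2600);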
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
--
-- event_position(): Fetch event position
--
CREATE OR REPLACE FUNCTION event_position (
tstamp timestamptz, sequence integer, point integer, tolerance numeric
)
RETURNS geometry
AS $$
DECLARE
position geometry;
BEGIN
-- Try and get position by sequence / point first
IF sequence IS NOT NULL AND point IS NOT NULL THEN
-- Try and get the position from final_shots or raw_shots
SELECT COALESCE(f.geometry, r.geometry) geometry
INTO position
FROM raw_shots r LEFT JOIN final_shots f USING (sequence, point)
WHERE r.sequence = event_position.sequence AND r.point = event_position.point;
IF position IS NOT NULL THEN
RETURN position;
ELSIF tstamp IS NULL THEN
-- Get the timestamp for the sequence / point, if we can.
-- It will be used later in the function as we fall back
-- to timestamp based search.
-- We also adjust the tolerance as we're now dealing with
-- an exact timestamp.
SELECT COALESCE(f.tstamp, r.tstamp) tstamp, 0.002 tolerance
INTO tstamp, tolerance
FROM raw_shots r LEFT JOIN final_shots f USING (sequence, point)
WHERE r.sequence = event_position.sequence AND r.point = event_position.point;
END IF;
END IF;
-- If we got here, we better have a timestamp
-- First attempt, get a position from final_shots, raw_shots. This may
-- be redundant if we got here because we had a sequence / point
-- without a position, but never mind.
SELECT COALESCE(f.geometry, r.geometry) geometry
INTO position
FROM raw_shots r LEFT JOIN final_shots f USING (sequence, point)
WHERE r.tstamp = event_position.tstamp OR f.tstamp = event_position.tstamp
LIMIT 1; -- Just to be sure
IF position IS NULL THEN
-- Ok, so everything else so far has failed, let's try and get this
-- from real time data. We skip the search via sequence / point and
-- go directly for timestamp.
SELECT geometry
INTO position
FROM geometry_from_tstamp(tstamp, tolerance);
END IF;
RETURN position;
END;
$$ LANGUAGE plpgsql;
COMMENT ON FUNCTION event_position (timestamptz, integer, integer, numeric) IS
'Return the position associated with a sequence / point in the current project or
with a given timestamp. Timestamp that is first searched for in the shot tables
of the current prospect or, if not found, in the real-time data.
Returns a geometry.';
CREATE OR REPLACE FUNCTION event_position (
tstamp timestamptz, sequence integer, point integer
)
RETURNS geometry
AS $$
BEGIN
RETURN event_position(tstamp, sequence, point, 3);
END;
$$ LANGUAGE plpgsql;
COMMENT ON FUNCTION event_position (timestamptz, integer, integer) IS
'Overload of event_position with a default tolerance of three seconds.';
CREATE OR REPLACE FUNCTION event_position (
tstamp timestamptz
)
RETURNS geometry
AS $$
BEGIN
RETURN event_position(tstamp, NULL, NULL);
END;
$$ LANGUAGE plpgsql;
COMMENT ON FUNCTION event_position (timestamptz) IS
'Overload of event_position (timestamptz, integer, integer) for use when searching by timestamp.';
CREATE OR REPLACE FUNCTION event_position (
sequence integer, point integer
)
RETURNS geometry
AS $$
BEGIN
RETURN event_position(NULL, sequence, point);
END;
$$ LANGUAGE plpgsql;
COMMENT ON FUNCTION event_position (integer, integer) IS
'Overload of event_position (timestamptz, integer, integer) for use when searching by sequence / point.';
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
BEGIN
--
-- event_meta(): Fetch event metadata
--
CREATE OR REPLACE FUNCTION event_meta (
tstamp timestamptz, sequence integer, point integer
)
RETURNS jsonb
AS $$
DECLARE
result jsonb;
-- Tolerance is hard-coded, at least until a need to expose arises.
tolerance numeric;
BEGIN
tolerance := 3; -- seconds
-- We search by timestamp if we can, as that's a lot quicker
IF tstamp IS NOT NULL THEN
SELECT meta
INTO result
FROM real_time_inputs rti
WHERE
rti.tstamp BETWEEN (event_meta.tstamp - tolerance * interval '1 second') AND (event_meta.tstamp + tolerance * interval '1 second')
ORDER BY abs(extract('epoch' FROM rti.tstamp - event_meta.tstamp ))
LIMIT 1;
ELSE
SELECT meta
INTO result
FROM real_time_inputs rti
WHERE
(meta->>'_sequence')::integer = event_meta.sequence AND
(meta->>'_point')::integer = event_meta.point
ORDER BY rti.tstamp DESC
LIMIT 1;
END IF;
RETURN result;
END;
$$ LANGUAGE plpgsql;
COMMENT ON FUNCTION event_meta (timestamptz, integer, integer) IS
'Return the real-time event metadata associated with a sequence / point in the current project or
with a given timestamp. The timestamp is first searched for in the shot tables
of the current prospect or, if not found, in the real-time data.
Returns a JSONB object.';
CREATE OR REPLACE FUNCTION event_meta (
tstamp timestamptz
)
RETURNS jsonb
AS $$
BEGIN
RETURN event_meta(tstamp, NULL, NULL);
END;
$$ LANGUAGE plpgsql;
COMMENT ON FUNCTION event_meta (timestamptz) IS
'Overload of event_meta (timestamptz, integer, integer) for use when searching by timestamp.';
CREATE OR REPLACE FUNCTION event_meta (
sequence integer, point integer
)
RETURNS jsonb
AS $$
BEGIN
RETURN event_meta(NULL, sequence, point);
END;
$$ LANGUAGE plpgsql;
COMMENT ON FUNCTION event_meta (integer, integer) IS
'Overload of event_meta (timestamptz, integer, integer) for use when searching by sequence / point.';
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.8"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.8"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,229 @@
-- Add replace_placeholders() and scan_placeholders() for event remarks.
--
-- New schema version: 0.3.9
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This defines a replace_placeholders() function, taking as arguments
-- a text string and either a timestamp or a sequence / point pair. It
-- uses the latter arguments to find metadata from which it can extract
-- relevant information and replace it into the text string wherever the
-- appropriate placeholders appear. For instance, given a call such as
-- replace_placeholders('The position is @POS@', NULL, 11, 2600) it will
-- replace '@POS@' with the position of point 2600 in sequence 11, if it
-- exists (or leave the placeholder untouched otherwise).
--
-- A scan_placeholders() procedure is also defined, which calls the above
-- function on the entire event log.
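--
-- e.g., reusing the example above:
--
--   SELECT replace_placeholders('The position is @POS@', NULL, 11, 2600);
--   CALL scan_placeholders();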
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION replace_placeholders (
text_in text, tstamp timestamptz, sequence integer, point integer
)
RETURNS text
AS $$
DECLARE
position geometry;
metadata jsonb;
text_out text;
json_query text;
json_result jsonb;
expect_recursion boolean := false;
BEGIN
text_out := text_in;
-- We only get a position if we are going to need it…
IF regexp_match(text_out, '@DMS@|@POS@|@DEG@') IS NOT NULL THEN
position := ST_Transform(event_position(tstamp, sequence, point), 4326);
END IF;
-- …and likewise with the metadata.
IF regexp_match(text_out, '@BSP@|@WD@|@CMG@|@EN@|@GRID@|@(\$\..*?)@@') IS NOT NULL THEN
metadata := event_meta(tstamp, sequence, point);
END IF;
-- We shortcut the evaluation if neither of the above regexps matched
IF position IS NULL AND metadata IS NULL THEN
RETURN text_out;
END IF;
IF position('@DMS@' IN text_out) != 0 THEN
text_out := replace(text_out, '@DMS@', ST_AsLatLonText(position));
END IF;
IF position('@POS@' IN text_out) != 0 THEN
text_out := replace(text_out, '@POS@', replace(ST_AsLatLonText(position, 'D.DDDDDD'), ' ', ', '));
END IF;
IF position('@DEG@' IN text_out) != 0 THEN
text_out := replace(text_out, '@DEG@', replace(ST_AsLatLonText(position, 'D.DDDDDD'), ' ', ', '));
END IF;
IF position('@EN@' IN text_out) != 0 THEN
IF metadata ? 'easting' AND metadata ? 'northing' THEN
text_out := replace(text_out, '@EN@', (metadata->>'easting') || ', ' || (metadata->>'northing'));
END IF;
END IF;
IF position('@GRID@' IN text_out) != 0 THEN
IF metadata ? 'easting' AND metadata ? 'northing' THEN
text_out := replace(text_out, '@GRID@', (metadata->>'easting') || ', ' || (metadata->>'northing'));
END IF;
END IF;
IF position('@CMG@' IN text_out) != 0 THEN
IF metadata ? 'bearing' THEN
text_out := replace(text_out, '@CMG@', metadata->>'bearing');
END IF;
END IF;
IF position('@BSP@' IN text_out) != 0 THEN
IF metadata ? 'speed' THEN
text_out := replace(text_out, '@BSP@', round((metadata->>'speed')::numeric * 3600 / 1852, 1)::text);
END IF;
END IF;
IF position('@WD@' IN text_out) != 0 THEN
IF metadata ? 'waterDepth' THEN
text_out := replace(text_out, '@WD@', metadata->>'waterDepth');
END IF;
END IF;
json_query := (regexp_match(text_out, '@(\$\..*?)@@'))[1];
IF json_query IS NOT NULL THEN
json_result := jsonb_path_query_array(metadata, json_query::jsonpath);
IF jsonb_array_length(json_result) = 1 THEN
text_out := replace(text_out, '@'||json_query||'@@', json_result->>0);
ELSE
text_out := replace(text_out, '@'||json_query||'@@', json_result::text);
END IF;
-- There might be multiple JSONPath queries, so we may have to recurse
expect_recursion := true;
END IF;
IF expect_recursion IS TRUE AND text_in != text_out THEN
--RAISE NOTICE 'Recursing %', text_out;
-- We don't know if we have found all the JSONPath expressions,
-- so we do another pass.
RETURN replace_placeholders(text_out, tstamp, sequence, point);
ELSE
RETURN text_out;
END IF;
END;
$$ LANGUAGE plpgsql;
COMMENT ON FUNCTION replace_placeholders (text, timestamptz, integer, integer) IS
'Replace certain placeholder strings in the input text with data obtained from shot or real-time data.';
CREATE OR REPLACE PROCEDURE scan_placeholders ()
LANGUAGE sql
AS $$
-- We update non read-only events via the event_log view to leave a trace
-- of the fact that placeholders were replaced (and when).
-- Note that this will not replace placeholders of old edits.
UPDATE event_log
SET remarks = replace_placeholders(remarks, tstamp, sequence, point)
FROM (
SELECT id
FROM event_log e
WHERE
(meta->'readonly')::boolean IS NOT TRUE AND (
regexp_match(remarks, '@DMS@|@POS@|@DEG@') IS NOT NULL OR
regexp_match(remarks, '@BSP@|@WD@|@CMG@|@EN@|@GRID@|@(\$\..*?)@@') IS NOT NULL
)
) t
WHERE event_log.id = t.id;
-- And then we update read-only events directly on the event_log_full table
-- (as of this version of the schema we're prevented from updating read-only
-- events via event_log anyway).
UPDATE event_log_full
SET remarks = replace_placeholders(remarks, tstamp, sequence, point)
FROM (
SELECT uid
FROM event_log_full e
WHERE
(meta->'readonly')::boolean IS TRUE AND (
regexp_match(remarks, '@DMS@|@POS@|@DEG@') IS NOT NULL OR
regexp_match(remarks, '@BSP@|@WD@|@CMG@|@EN@|@GRID@|@(\$\..*?)@@') IS NOT NULL
)
) t
WHERE event_log_full.uid = t.uid;
$$;
COMMENT ON PROCEDURE scan_placeholders () IS
'Run replace_placeholders() on the entire event log.';
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.9"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.9"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,127 @@
-- Add interpolate_geometry_from_tstamp() for position interpolation.
--
-- New schema version: 0.3.10
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects only the public schema.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This defines an interpolate_geometry_from_tstamp() function, taking a timestamp
-- and a maximum timespan in seconds. It will then interpolate a position
-- at the exact timestamp based on data from real_time_inputs, provided
-- that the effective interpolation timespan does not exceed the maximum
-- requested.
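--
-- e.g. interpolate a position at an illustrative timestamp, allowing at
-- most 60 seconds between the straddling fixes:
--
--   SELECT public.interpolate_geometry_from_tstamp('2023-10-01 12:00+00', 60);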
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
BEGIN
CALL pg_temp.show_notice('Defining interpolate_geometry_from_tstamp()');
CREATE OR REPLACE FUNCTION public.interpolate_geometry_from_tstamp(
IN ts timestamptz,
IN maxspan numeric
)
RETURNS geometry
AS $$
DECLARE
ts0 timestamptz;
ts1 timestamptz;
geom0 geometry;
geom1 geometry;
span numeric;
fraction numeric;
BEGIN
SELECT tstamp, geometry
INTO ts0, geom0
FROM real_time_inputs
WHERE tstamp <= ts
ORDER BY tstamp DESC
LIMIT 1;
SELECT tstamp, geometry
INTO ts1, geom1
FROM real_time_inputs
WHERE tstamp >= ts
ORDER BY tstamp ASC
LIMIT 1;
IF geom0 IS NULL OR geom1 IS NULL THEN
RAISE NOTICE 'Interpolation failed (no straddling data)';
RETURN NULL;
END IF;
-- See if we got an exact match
IF ts0 = ts THEN
RETURN geom0;
ELSIF ts1 = ts THEN
RETURN geom1;
END IF;
span := extract('epoch' FROM ts1 - ts0);
IF span > maxspan THEN
RAISE NOTICE 'Interpolation timespan % outside maximum requested (%)', span, maxspan;
RETURN NULL;
END IF;
fraction := extract('epoch' FROM ts - ts0) / span;
IF fraction < 0 OR fraction > 1 THEN
RAISE NOTICE 'Requested timestamp % outside of interpolation span (fraction: %)', ts, fraction;
RETURN NULL;
END IF;
RETURN ST_LineInterpolatePoint(St_MakeLine(geom0, geom1), fraction);
END;
$$ LANGUAGE plpgsql;
COMMENT ON FUNCTION public.interpolate_geometry_from_tstamp(timestamptz, numeric) IS
'Interpolate a position over a given maximum timespan (in seconds)
based on real-time inputs. Returns a POINT geometry.';
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.10"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.10"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,149 @@
-- Use interpolation in augment_event_data().
--
-- New schema version: 0.3.11
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This redefines augment_event_data() to use interpolation rather than
-- nearest neighbour. It now takes an argument indicating the maximum
-- allowed interpolation timespan. An overload with a default of ten
-- minutes is also provided, as an in situ replacement for the previous
-- version.
--
-- The ten minute default is based on Triggerfish headers behaviour seen
-- on crew 248 during soft starts.
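--
-- e.g.:
--
--   CALL augment_event_data();     -- default maximum span of 600 s
--   CALL augment_event_data(60);   -- stricter 60 s maximum span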
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE PROCEDURE augment_event_data (maxspan numeric)
LANGUAGE sql
AS $$
-- Populate the timestamp of sequence / point events
UPDATE event_log_full
SET tstamp = tstamp_from_sequence_shot(sequence, point)
WHERE
tstamp IS NULL AND sequence IS NOT NULL AND point IS NOT NULL;
-- Populate the geometry of sequence / point events for which
-- there is raw_shots data.
UPDATE event_log_full
SET meta = meta ||
jsonb_build_object(
'geometry',
(
SELECT st_transform(geometry, 4326)::jsonb
FROM raw_shots rs
WHERE rs.sequence = event_log_full.sequence AND rs.point = event_log_full.point
)
)
WHERE
sequence IS NOT NULL AND point IS NOT NULL AND
NOT meta ? 'geometry';
-- Populate the geometry of time-based events
UPDATE event_log_full e
SET
meta = meta || jsonb_build_object('geometry',
(SELECT st_transform(g.geometry, 4326)::jsonb
FROM interpolate_geometry_from_tstamp(e.tstamp, maxspan) g))
WHERE
tstamp IS NOT NULL AND
sequence IS NULL AND point IS NULL AND
NOT meta ? 'geometry';
-- Get rid of null geometries
UPDATE event_log_full
SET
meta = meta - 'geometry'
WHERE
jsonb_typeof(meta->'geometry') = 'null';
-- Simplify the GeoJSON when the CRS is EPSG:4326
UPDATE event_log_full
SET
meta = meta #- '{geometry, crs}'
WHERE
meta->'geometry'->'crs'->'properties'->>'name' = 'EPSG:4326';
$$;
COMMENT ON PROCEDURE augment_event_data(numeric)
IS 'Populate missing timestamps and geometries in event_log_full';
CREATE OR REPLACE PROCEDURE augment_event_data ()
LANGUAGE sql
AS $$
CALL augment_event_data(600);
$$;
COMMENT ON PROCEDURE augment_event_data()
IS 'Overload of augment_event_data(maxspan numeric) with a maxspan value of 600 seconds.';
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.11"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.11"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,193 @@
-- Add midnight_shots view and log_midnight_shots() procedure.
--
-- New schema version: 0.3.12
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This defines a midnight_shots view and a log_midnight_shots() procedure
-- (with some overloads). The view returns all points straddling midnight
-- UTC and belonging to the same sequence (so last shot of the day and
-- first shot of the next day).
--
-- The procedure inserts the corresponding events (optionally constrained
-- by an earliest and a latest date) in the event log, unless the events
-- already exist.
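--
-- e.g. (dates are illustrative):
--
--   CALL log_midnight_shots();                            -- whole log
--   CALL log_midnight_shots('2023-10-01');                -- from a date
--   CALL log_midnight_shots('2023-10-01', '2023-10-31');  -- date range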
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE VIEW midnight_shots AS
WITH straddlers AS (
-- Get sequence numbers straddling midnight UTC
SELECT sequence
FROM final_shots
GROUP BY sequence
HAVING min(date(tstamp)) != max(date(tstamp))
),
ts AS (
-- Get earliest and latest timestamps for each day
-- for each of the above sequences.
-- This will return the timestamps for:
-- FSP, LDSP, FDSP, LSP.
SELECT
fs.sequence,
min(fs.tstamp) AS ts0,
max(fs.tstamp) AS ts1
FROM final_shots fs INNER JOIN straddlers USING (sequence)
GROUP BY fs.sequence, (date(fs.tstamp))
ORDER BY fs.sequence, date(fs.tstamp)
),
spts AS (
-- Filter out FSP, LSP from the above.
-- NOTE: This *should* in theory be able to cope with
-- a sequence longer than 24 hours (so with more than
-- one LDSP, FDSP) but that hasn't been tested.
SELECT DISTINCT
sequence,
min(ts1) OVER (PARTITION BY sequence) ldsp,
max(ts0) OVER (PARTITION BY sequence) fdsp
FROM ts
ORDER BY sequence
), evt AS (
SELECT
fs.tstamp,
fs.sequence,
point,
'Last shotpoint of the day' remarks,
'{LDSP}'::text[] labels
FROM final_shots fs
INNER JOIN spts ON fs.sequence = spts.sequence AND fs.tstamp = spts.ldsp
UNION SELECT
fs.tstamp,
fs.sequence,
point,
'First shotpoint of the day' remarks,
'{FDSP}'::text[] labels
FROM final_shots fs
INNER JOIN spts ON fs.sequence = spts.sequence AND fs.tstamp = spts.fdsp
ORDER BY tstamp
)
SELECT * FROM evt;
CREATE OR REPLACE PROCEDURE log_midnight_shots (dt0 date, dt1 date)
LANGUAGE sql
AS $$
INSERT INTO event_log (sequence, point, remarks, labels, meta)
SELECT
sequence, point, remarks, labels,
'{"auto": true, "insertedBy": "log_midnight_shots"}'::jsonb
FROM midnight_shots ms
WHERE
(dt0 IS NULL OR ms.tstamp >= dt0) AND
(dt1 IS NULL OR ms.tstamp <= dt1) AND
NOT EXISTS (
SELECT 1
FROM event_log el
WHERE ms.sequence = el.sequence AND ms.point = el.point AND el.labels @> ms.labels
);
-- Delete any midnight shots that might have been inserted in the log
-- but are no longer relevant according to the final_shots data.
-- We operate on event_log, so the deletion is traceable.
DELETE
FROM event_log
WHERE id IN (
SELECT id
FROM event_log el
LEFT JOIN midnight_shots ms USING (sequence, point)
WHERE
'{LDSP,FDSP}'::text[] && el.labels -- &&: Do the arrays overlap?
AND ms.sequence IS NULL
);
$$;
COMMENT ON PROCEDURE log_midnight_shots (date, date)
IS 'Add midnight shots between two dates dt0 and dt1 to the event_log, unless the events already exist.';
CREATE OR REPLACE PROCEDURE log_midnight_shots (dt0 date)
LANGUAGE sql
AS $$
CALL log_midnight_shots(dt0, NULL);
$$;
COMMENT ON PROCEDURE log_midnight_shots (date)
IS 'Overload taking only a dt0 (adds events on that date or after).';
CREATE OR REPLACE PROCEDURE log_midnight_shots ()
LANGUAGE sql
AS $$
CALL log_midnight_shots(NULL, NULL);
$$;
COMMENT ON PROCEDURE log_midnight_shots ()
IS 'Overload taking no arguments (adds all missing events).';
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.12"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.12"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,162 @@
-- Fix wrong number of missing shots in summary views
--
-- New schema version: 0.3.13
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- Fixes a bug in the `final_lines_summary` and `raw_lines_summary` views
-- which results in the number of missing shots being miscounted on jobs
-- using three sources.
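--
-- The corrected views take the count from the per-sequence
-- missing_sequence_raw_points / missing_sequence_final_points views,
-- e.g.:
--
--   SELECT sequence, line, missing_shots FROM final_lines_summary;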
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE VIEW raw_lines_summary AS
WITH summary AS (
SELECT DISTINCT rs.sequence,
first_value(rs.point) OVER w AS fsp,
last_value(rs.point) OVER w AS lsp,
first_value(rs.tstamp) OVER w AS ts0,
last_value(rs.tstamp) OVER w AS ts1,
count(rs.point) OVER w AS num_points,
count(pp.point) OVER w AS num_preplots,
public.st_distance(first_value(rs.geometry) OVER w, last_value(rs.geometry) OVER w) AS length,
((public.st_azimuth(first_value(rs.geometry) OVER w, last_value(rs.geometry) OVER w) * (180)::double precision) / pi()) AS azimuth
FROM (raw_shots rs
LEFT JOIN preplot_points pp USING (line, point))
WINDOW w AS (PARTITION BY rs.sequence ORDER BY rs.tstamp ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
)
SELECT rl.sequence,
rl.line,
s.fsp,
s.lsp,
s.ts0,
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
s.num_preplots,
(SELECT count(*) AS count
FROM missing_sequence_raw_points
WHERE missing_sequence_raw_points.sequence = s.sequence) AS missing_shots,
s.length,
s.azimuth,
rl.remarks,
rl.ntbp,
rl.meta
FROM (summary s
JOIN raw_lines rl USING (sequence));
CREATE OR REPLACE VIEW final_lines_summary AS
WITH summary AS (
SELECT DISTINCT fs.sequence,
first_value(fs.point) OVER w AS fsp,
last_value(fs.point) OVER w AS lsp,
first_value(fs.tstamp) OVER w AS ts0,
last_value(fs.tstamp) OVER w AS ts1,
count(fs.point) OVER w AS num_points,
public.st_distance(first_value(fs.geometry) OVER w, last_value(fs.geometry) OVER w) AS length,
((public.st_azimuth(first_value(fs.geometry) OVER w, last_value(fs.geometry) OVER w) * (180)::double precision) / pi()) AS azimuth
FROM final_shots fs
WINDOW w AS (PARTITION BY fs.sequence ORDER BY fs.tstamp ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
)
SELECT fl.sequence,
fl.line,
s.fsp,
s.lsp,
s.ts0,
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
( SELECT count(*) AS count
FROM missing_sequence_final_points
WHERE missing_sequence_final_points.sequence = s.sequence) AS missing_shots,
s.length,
s.azimuth,
fl.remarks,
fl.meta
FROM (summary s
JOIN final_lines fl USING (sequence));
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.3.13' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.3.12' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.13"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.13"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,122 @@
-- Adapt schema to the new project configuration handling.
--
-- New schema version: 0.4.0
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This adapts the schema to the change in how project configurations are
-- handled (https://gitlab.com/wgp/dougal/software/-/merge_requests/29)
-- by creating a project_configuration() function which returns the
-- current project's configuration data.
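--
-- e.g. fetch the planner settings for the current project:
--
--   SELECT project_configuration()->'planner';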
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION project_configuration()
RETURNS jsonb
LANGUAGE plpgsql
AS $$
DECLARE
schema_name text;
configuration jsonb;
BEGIN
SELECT nspname
INTO schema_name
FROM pg_namespace
WHERE oid = (
SELECT pronamespace
FROM pg_proc
WHERE oid = 'project_configuration'::regproc::oid
);
SELECT meta
INTO configuration
FROM public.projects
WHERE schema = schema_name;
RETURN configuration;
END
$$;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.0' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.3.12' AND current_db_version != '0.3.13' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.0"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.0"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--

View File

@@ -0,0 +1,264 @@
-- Modify adjust_planner() to use project_configuration()
--
-- New schema version: 0.4.1
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This modifies adjust_planner() to use project_configuration()
--
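-- For illustration only (not part of the upgrade): adjust_planner() now
-- reads its settings from project_configuration()->'planner', so a manual
-- invocation is simply:
--
--   CALL adjust_planner();
--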
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE PROCEDURE adjust_planner()
LANGUAGE plpgsql
AS $$
DECLARE
_planner_config jsonb;
_planned_line planned_lines%ROWTYPE;
_lag interval;
_last_sequence sequences_summary%ROWTYPE;
_deltatime interval;
_shotinterval interval;
_tstamp timestamptz;
_incr integer;
BEGIN
SET CONSTRAINTS planned_lines_pkey DEFERRED;
SELECT project_configuration()->'planner'
INTO _planner_config;
SELECT *
INTO _last_sequence
FROM sequences_summary
ORDER BY sequence DESC
LIMIT 1;
SELECT *
INTO _planned_line
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
SELECT
COALESCE(
((lead(ts0) OVER (ORDER BY sequence)) - ts1),
make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer)
)
INTO _lag
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
_incr = sign(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE '_planner_config: %', _planner_config;
RAISE NOTICE '_last_sequence: %', _last_sequence;
RAISE NOTICE '_planned_line: %', _planned_line;
RAISE NOTICE '_incr: %', _incr;
-- Does the latest sequence match a planned sequence?
IF _planned_line IS NULL THEN -- No it doesn't
RAISE NOTICE 'Latest sequence shot does not match a planned sequence';
SELECT * INTO _planned_line FROM planned_lines ORDER BY sequence ASC LIMIT 1;
RAISE NOTICE '_planned_line: %', _planned_line;
IF _planned_line.sequence <= _last_sequence.sequence THEN
RAISE NOTICE 'Renumbering the planned sequences starting from %', _last_sequence.sequence + 1;
-- Renumber the planned sequences starting from last shot sequence number + 1
UPDATE planned_lines
SET sequence = sequence + _last_sequence.sequence - _planned_line.sequence + 1;
END IF;
-- The correction to make to the first planned line's ts0 will be based on either the last
-- sequence's EOL + default line change time or the current time, whichever is later.
_deltatime := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1) + make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer), current_timestamp) - _planned_line.ts0;
-- Is the first of the planned lines start time in the past? (±5 mins)
IF _planned_line.ts0 < (current_timestamp - make_interval(mins => 5)) THEN
RAISE NOTICE 'First planned line is in the past. Adjusting times by %', _deltatime;
-- Adjust the start / end time of the planned lines by assuming that we are at
-- `defaultLineChangeDuration` minutes away from SOL of the first planned line.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime;
END IF;
ELSE -- Yes it does
RAISE NOTICE 'Latest sequence does match a planned sequence: %, %', _planned_line.sequence, _planned_line.line;
-- Is it online?
IF EXISTS(SELECT 1 FROM raw_lines_files WHERE sequence = _last_sequence.sequence AND hash = '*online*') THEN
-- Yes it is
RAISE NOTICE 'Sequence % is online', _last_sequence.sequence;
-- Let us get the SOL from the events log if we can
RAISE NOTICE 'Trying to set fsp, ts0 from events log FSP, FGSP';
WITH e AS (
SELECT * FROM event_log
WHERE
sequence = _last_sequence.sequence
AND ('FSP' = ANY(labels) OR 'FGSP' = ANY(labels))
ORDER BY tstamp LIMIT 1
)
UPDATE planned_lines
SET
fsp = COALESCE(e.point, fsp),
ts0 = COALESCE(e.tstamp, ts0)
FROM e
WHERE planned_lines.sequence = _last_sequence.sequence;
-- Shot interval
_shotinterval := (_last_sequence.ts1 - _last_sequence.ts0) / abs(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE 'Estimating EOL from current shot interval: %', _shotinterval;
SELECT (abs(lsp-fsp) * _shotinterval + ts0) - ts1
INTO _deltatime
FROM planned_lines
WHERE sequence = _last_sequence.sequence;
---- Set ts1 for the current sequence
--UPDATE planned_lines
--SET
--ts1 = (abs(lsp-fsp) * _shotinterval) + ts0
--WHERE sequence = _last_sequence.sequence;
RAISE NOTICE 'Adjustment is %', _deltatime;
IF abs(EXTRACT(EPOCH FROM _deltatime)) < 8 THEN
RAISE NOTICE 'Adjustment too small (< 8 s), so not applying it';
RETURN;
END IF;
-- Adjust ts1 for the current sequence
UPDATE planned_lines
SET ts1 = ts1 + _deltatime
WHERE sequence = _last_sequence.sequence;
-- Now shift all sequences after
UPDATE planned_lines
SET ts0 = ts0 + _deltatime, ts1 = ts1 + _deltatime
WHERE sequence > _last_sequence.sequence;
RAISE NOTICE 'Deleting planned sequences before %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence < _last_sequence.sequence;
ELSE
-- No it isn't
RAISE NOTICE 'Sequence % is offline', _last_sequence.sequence;
-- We were supposed to finish at _planned_line.ts1 but we finished at:
_tstamp := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1), current_timestamp);
-- WARNING Next line is for testing only
--_tstamp := COALESCE(_last_sequence.ts1_final, _last_sequence.ts1);
-- So we need to adjust timestamps by:
_deltatime := _tstamp - _planned_line.ts1;
RAISE NOTICE 'Planned end: %, actual end: % (%, %)', _planned_line.ts1, _tstamp, _planned_line.sequence, _last_sequence.sequence;
RAISE NOTICE 'Shifting times by % for sequences > %', _deltatime, _planned_line.sequence;
-- NOTE: This won't work if sequences are not, err… sequential.
-- NOTE: This has been known to happen in 2020.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime
WHERE sequence > _planned_line.sequence;
RAISE NOTICE 'Deleting planned sequences up to %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence <= _last_sequence.sequence;
END IF;
END IF;
END;
$$;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.1' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.4.0' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.1"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.1"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--

View File

@@ -0,0 +1,98 @@
-- Modify binning_parameters() to use project_configuration()
--
-- New schema version: 0.4.2
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This modifies binning_parameters() to use project_configuration()
--
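-- For illustration only (not part of the upgrade): after this change,
--
--   SELECT binning_parameters();
--
-- returns the same value as SELECT project_configuration()->'binning';
--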
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION binning_parameters() RETURNS jsonb
LANGUAGE sql STABLE LEAKPROOF PARALLEL SAFE
AS $$
SELECT project_configuration()->'binning' binning;
$$;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.2' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.4.1' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.2"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.2"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--

View File

@@ -0,0 +1,164 @@
-- Support notification payloads larger than Postgres' NOTIFY limit.
--
-- New schema version: 0.4.3
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects the public schema only.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This creates a new table where large notification payloads are stored
-- temporarily and from which they might be recalled by the notification
-- listeners. It also creates a purge_notifications() procedure used to
-- clean up old notifications from the notifications log and finally,
-- modifies notify() to support these changes. When a large payload is
-- encountered, the payload is stored in the notify_payloads table and
-- a trimmed down version containing a notification_id is sent to listeners
-- instead. Listeners can then query notify_payloads to retrieve the full
-- payloads. It is the application layer's responsibility to delete old
-- notifications.
--
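-- For illustration only (not part of the upgrade): a listener receiving
-- a trimmed notification can recover the payload by its ID (a hypothetical
-- value of 42 here) and prune expired entries:
--
--   SELECT payload FROM public.notify_payloads WHERE id = 42;
--   CALL public.purge_notifications(120);  -- drop payloads older than 120 s
--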
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_schema () AS $outer$
BEGIN
RAISE NOTICE 'Updating public schema';
-- This upgrade modifies the public schema only, so pin the
-- search path accordingly.
SET search_path TO public;
CREATE TABLE IF NOT EXISTS public.notify_payloads (
id SERIAL,
tstamp timestamptz NOT NULL DEFAULT CURRENT_TIMESTAMP,
payload text NOT NULL DEFAULT '',
PRIMARY KEY (id)
);
CREATE INDEX IF NOT EXISTS notify_payload_tstamp ON notify_payloads (tstamp);
CREATE OR REPLACE FUNCTION public.notify() RETURNS trigger
LANGUAGE plpgsql
AS $$
DECLARE
channel text := TG_ARGV[0];
pid text;
payload text;
notification text;
payload_id integer;
BEGIN
SELECT projects.pid INTO pid FROM projects WHERE schema = TG_TABLE_SCHEMA;
payload := json_build_object(
'tstamp', CURRENT_TIMESTAMP,
'operation', TG_OP,
'schema', TG_TABLE_SCHEMA,
'table', TG_TABLE_NAME,
'old', row_to_json(OLD),
'new', row_to_json(NEW),
'pid', pid
)::text;
IF octet_length(payload) < 1000 THEN
PERFORM pg_notify(channel, payload);
ELSE
-- Payload too large to send directly: store it in notify_payloads
-- and publish a trimmed notification carrying its ID instead.
-- Listeners can fetch the full payload by ID; expiring old payloads
-- is left to the application layer (see purge_notifications()).
INSERT INTO notify_payloads (payload) VALUES (payload) RETURNING id INTO payload_id;
notification := json_build_object(
'tstamp', CURRENT_TIMESTAMP,
'operation', TG_OP,
'schema', TG_TABLE_SCHEMA,
'table', TG_TABLE_NAME,
'pid', pid,
'payload_id', payload_id
)::text;
PERFORM pg_notify(channel, notification);
RAISE INFO 'Payload over limit';
END IF;
RETURN NULL;
END;
$$;
CREATE OR REPLACE PROCEDURE public.purge_notifications (age_seconds numeric DEFAULT 120) AS $$
DELETE FROM notify_payloads WHERE EXTRACT(epoch FROM CURRENT_TIMESTAMP - tstamp) > age_seconds;
$$ LANGUAGE sql;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.3' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.4.2' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
-- This upgrade modified the `public` schema only, not individual
-- project schemas.
CALL pg_temp.upgrade_schema();
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_schema ();
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.3"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.3"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--

View File

@@ -0,0 +1,104 @@
-- Add event_log_changes function
--
-- New schema version: 0.4.4
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This adds a function event_log_changes which returns the subset of
-- events from event_log_full which have been modified on or after a
-- given timestamp.
--
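-- For illustration only (not part of the upgrade): fetch every event
-- created or modified since a given instant, e.g.
--
--   SELECT * FROM event_log_changes('2023-10-17T00:00:00Z'::timestamptz);
--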
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION event_log_changes(ts0 timestamptz)
RETURNS SETOF event_log_full
LANGUAGE sql
AS $$
SELECT *
FROM event_log_full
WHERE lower(validity) > ts0 OR (upper(validity) IS NOT NULL AND upper(validity) > ts0)
ORDER BY lower(validity);
$$;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.4' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.4.3' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.4"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.4"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -7,24 +7,39 @@
id: missing_shots
check: |
const sequence = currentItem;
const sp0 = Math.min(sequence.fsp, sequence.lsp);
const sp1 = Math.max(sequence.fsp, sequence.lsp);
const missing = preplots.filter(r => r.line == sequence.line &&
r.point >= sp0 && r.point <= sp1 &&
!sequence.shots.find(s => s.point == r.point)
);
let results;
if (sequence.missing_shots) {
results = {
shots: {}
}
const missing_shots = missingShotpoints.filter(i => !i.ntba);
for (const shot of missing_shots) {
results.shots[shot.point] = { remarks: "Missed shot", labels: [ "QC", "QCAcq" ] };
}
} else {
results = true;
}
missing.length == 0 || missing.map(r => `Missing shot: ${r.point}`).join("\n")
results;
-
name: "Gun QC"
disabled: false
labels: [ "QC", "QCGuns" ]
children:
-
name: "Sequences without gun data"
iterate: "sequences"
id: seq_no_gun_data
check: |
shotpoints.some(i => i.meta?.raw?.smsrc) || "Sequence has no gun data"
-
name: "Missing gun data"
id: missing_gun_data
ignoreAllFailed: true
check: |
!!currentItem._("raw_meta.smsrc.guns") || "Missing gun data"
!!currentItem._("raw_meta.smsrc.guns")
? true
: "Missing gun data"
-
name: "No fire"
@@ -32,8 +47,8 @@
check: |
const currentShot = currentItem;
const gunData = currentItem._("raw_meta.smsrc");
(gunData && gunData.num_nofire != 0)
? `Source ${gunData.src_number}: No fire (${gunData.num_nofire} guns)`
(gunData && gunData.guns && gunData.guns.length != gunData.num_active)
? `Source ${gunData.src_number}: No fire (${gunData.guns.length - gunData.num_active} guns)`
: true;
-
@@ -47,8 +62,8 @@
.guns
.filter(gun => ((gun[2] == gunData.src_number) && (gun[pressure]/parameters.gunPressureNominal - 1) > parameters.gunPressureToleranceRatio))
.map(gun =>
`source ${gun[2]}, string ${gun[0]}, gun ${gun[1]}, pressure: ${gun[pressure]} / ${parameters.gunPressureNominal} = ${(Math.abs(gunData.manifold/parameters.gunPressureNominal - 1)*100).toFixed(1)}% > ${(parameters.gunPressureToleranceRatio*100).toFixed(1)}%`
);
`source ${gun[2]}, string ${gun[0]}, gun ${gun[1]}, pressure: ${gun[pressure]} / ${parameters.gunPressureNominal} = ${(Math.abs(gun[pressure]/parameters.gunPressureNominal - 1)*100).toFixed(2)}% > ${(parameters.gunPressureToleranceRatio*100).toFixed(2)}%`
).join(" \n");
results && results.length
? results
: true
@@ -150,7 +165,7 @@
.filter(gun => Math.abs(gun[firetime]-gun[aimpoint]) >= parameters.gunTimingWarning && Math.abs(gun[firetime]-gun[aimpoint]) <= parameters.gunTiming)
.forEach(gun => {
const value = Math.abs(gun[firetime]-gun[aimpoint]);
result.push(`Delta error: source ${gun[2]}, string ${gun[0]}, gun ${gun[1]}: ${parameters.gunTimingWarning} ≤ ${value.toFixed(2)} ≤ ${parameters.gunTiming}`);
result.push(`Delta warning: source ${gun[2]}, string ${gun[0]}, gun ${gun[1]}: ${parameters.gunTimingWarning} ≤ ${value.toFixed(2)} ≤ ${parameters.gunTiming}`);
});
}
if (result.length) {
@@ -192,7 +207,7 @@
check: |
const currentShot = currentItem;
Math.abs(currentShot.error_i) <= parameters.crosslineError
|| `Crossline error: ${currentShot.error_i.toFixed(1)} > ${parameters.crosslineError}`
|| `Crossline error (${currentShot.type}): ${currentShot.error_i.toFixed(2)} > ${parameters.crosslineError}`
-
name: "Inline"
@@ -200,7 +215,7 @@
check: |
const currentShot = currentItem;
Math.abs(currentShot.error_j) <= parameters.inlineError
|| `Inline error: ${currentShot.error_j.toFixed(1)} > ${parameters.inlineError}`
|| `Inline error (${currentShot.type}): ${currentShot.error_j.toFixed(2)} > ${parameters.inlineError}`
-
name: "Centre of source preplot deviation (moving average)"
@@ -213,11 +228,16 @@
id: crossline_average
check: |
const currentSequence = currentItem;
const i_err = currentSequence.shots.filter(s => s.error_i != null).map(a => a.error_i);
//const i_err = shotpoints.filter(s => s.error_i != null).map(a => a.error_i);
const i_err = shotpoints.map(i =>
(i.errorfinal?.coordinates ?? i.errorraw?.coordinates)[0]
)
.filter(i => !isNaN(i));
if (i_err.length) {
const avg = i_err.reduce( (a, b) => a+b)/i_err.length;
avg <= parameters.crosslineErrorAverage ||
`Average crossline error: ${avg.toFixed(1)} > ${parameters.crosslineErrorAverage}`
`Average crossline error: ${avg.toFixed(2)} > ${parameters.crosslineErrorAverage}`
} else {
`Sequence ${currentSequence.sequence} has no shots within preplot`
}
@@ -230,16 +250,27 @@
check: |
const currentSequence = currentItem;
const n = parameters.inlineErrorRunningAverageShots; // For brevity
const results = currentSequence.shots.slice(n/2, -n/2).map( (shot, index) => {
const shots = currentSequence.shots.slice(index, index+n).map(i => i.error_j).filter(i => i !== null);
const results = shotpoints.slice(n/2, -n/2).map( (shot, index) => {
const shots = shotpoints.slice(index, index+n).map(i =>
(i.errorfinal?.coordinates ?? i.errorraw?.coordinates)[1]
).filter(i => i !== null);
if (!shots.length) {
// We are outside the preplot
// Nothing to see here, move along
return true;
}
const mean = shots.reduce( (a, b) => a+b ) / shots.length;
return Math.abs(mean) <= parameters.inlineErrorRunningAverageValue ||
`Running average inline error: shot ${shot.point}, ${mean.toFixed(1)} > ${parameters.inlineErrorRunningAverageValue}`
return Math.abs(mean) <= parameters.inlineErrorRunningAverageValue || [
shot.point,
{
remarks: `Running average inline error: ${mean.toFixed(2)} > ${parameters.inlineErrorRunningAverageValue}`,
labels: [ "QC", "QCNav" ]
}
]
}).filter(i => i !== true);
results.length == 0 || results.join("\n");
results.length == 0 || {
remarks: "Sequence exceeds inline error running average limit",
shots: Object.fromEntries(results)
}

etc/ssl/README.md Normal file
View File

@@ -0,0 +1,3 @@
# TLS certificates directory
Drop TLS certificates required by Dougal in this directory. This directory is excluded by [`.gitignore`](../../.gitignore), so its contents should never be committed by accident (and must not be committed on purpose!).

View File

@@ -1,6 +1,9 @@
{
"jwt": {
"secret": ""
"secret": "",
"options": {
"expiresIn": 1800
}
},
"db": {
"user": "postgres",

View File

@@ -1,5 +1,8 @@
module.exports = {
presets: [
'@vue/cli-plugin-babel/preset'
],
plugins: [
'@babel/plugin-proposal-logical-assignment-operators'
]
}

File diff suppressed because it is too large

View File

@@ -3,30 +3,35 @@
"version": "0.0.0",
"private": true,
"scripts": {
"serve": "vue-cli-service serve",
"serve": "vue-cli-service serve --host=0.0.0.0",
"build": "vue-cli-service build"
},
"dependencies": {
"@mdi/font": "^5.6.55",
"@mdi/font": "^7.2.96",
"core-js": "^3.6.5",
"d3": "^7.0.1",
"jwt-decode": "^3.0.0",
"leaflet": "^1.7.1",
"leaflet-arrowheads": "^1.2.2",
"leaflet-realtime": "^2.2.0",
"leaflet.markercluster": "^1.4.1",
"marked": "^2.0.3",
"plotly.js-dist": "^2.5.0",
"suncalc": "^1.8.0",
"typeface-roboto": "0.0.75",
"vue": "^2.6.12",
"vue-debounce": "^2.5.7",
"vue-router": "^3.4.5",
"vuetify": "^2.3.12",
"vuex": "^3.5.1"
"vue-debounce": "^2.6.0",
"vue-router": "^3.5.1",
"vuetify": "^2.5.0",
"vuex": "^3.6.2"
},
"devDependencies": {
"@babel/plugin-proposal-logical-assignment-operators": "^7.14.5",
"@vue/cli-plugin-babel": "~4.4.0",
"@vue/cli-plugin-router": "~4.4.0",
"@vue/cli-plugin-vuex": "~4.4.0",
"@vue/cli-service": "~4.4.0",
"sass": "^1.26.11",
"@vue/cli-service": "^4.5.13",
"sass": "~1.32",
"sass-loader": "^8.0.0",
"stylus": "^0.54.8",
"stylus-loader": "^3.0.2",

View File

@@ -26,9 +26,16 @@
<style lang="stylus">
@import '../node_modules/typeface-roboto/index.css'
@import '../node_modules/@mdi/font/css/materialdesignicons.css'
.markdown.v-textarea textarea
font-family monospace
line-height 1.1 !important
</style>
</style>
<script>
import { mapActions } from 'vuex';
import DougalNavigation from './components/navigation';
import DougalFooter from './components/footer';
@@ -58,12 +65,27 @@ export default {
snackText (newVal) {
this.snack = !!newVal;
},
snack (newVal) {
// When the snack is hidden (one way or another), clear
// the text so that if we receive the same message again
// afterwards it will be shown. This way, if we get spammed
// we're also not triggering the snack too often.
if (!newVal) {
this.$store.commit('setSnackText', "");
}
}
},
methods: {
...mapActions(["setCredentials"])
},
mounted () {
// Local Storage values are always strings
this.$vuetify.theme.dark = localStorage.getItem("darkTheme") == "true";
this.setCredentials()
}
};

View File

@@ -1,6 +1,7 @@
<template>
<v-menu
v-model="show"
:value="value"
@input="(e) => $emit('input', e)"
:position-x="absolute && x || undefined"
:position-y="absolute && y || undefined"
:absolute="absolute"
@@ -20,6 +21,7 @@
<dougal-context-menu v-if="item.items"
:value="showSubmenu"
:items="item.items"
:labels="labels.concat(item.labels||[])"
@input="selected"
submenu>
<template v-slot:activator="{ on, attrs }">
@@ -55,14 +57,14 @@ export default {
props: {
value: { type: [ MouseEvent, Object, Boolean ] },
labels: { type: [ Array ], default: () => [] },
absolute: { type: Boolean, default: false },
submenu: { type: Boolean, default: false },
items: { type: Array, default: [] }
items: { type: Array, default: () => [] }
},
data () {
return {
show: false,
x: 0,
y: 0,
showSubmenu: false
@@ -97,7 +99,12 @@ export default {
selected (item) {
this.show = false;
this.$emit('input', item);
if (typeof item === 'object' && item !== null) {
const labels = this.labels.concat(item.labels??[]);
this.$emit('input', {...item, labels});
} else {
this.$emit('input', item);
}
}
}

View File

@@ -1,406 +0,0 @@
<template>
<v-dialog
v-model="show"
max-width="600px"
>
<template v-slot:activator="{ on, attrs }">
<v-btn
class="mx-2"
fab dark
x-small
color="primary"
title="Add event"
v-bind="attrs"
v-on="on"
>
<v-icon dark>mdi-plus</v-icon>
</v-btn>
</template>
<v-card>
<v-card-title>
<span class="headline">{{ formTitle }}</span>
</v-card-title>
<v-card-text>
<v-container>
<v-row>
<v-col>
<v-textarea
v-model="remarks"
label="Description"
rows="1"
auto-grow
clearable
autofocus
filled
:hint="presetRemarks ? 'Enter your own comment or select a preset one from the menu on the left' : 'Enter a comment'"
@keyup="handleKeys"
>
<template v-slot:prepend v-if="presetRemarks">
<v-icon
title="Select predefined comments"
color="primary"
@click="showRemarksMenu"
>
mdi-dots-vertical
</v-icon>
</template>
<template v-slot:prepend v-else>
<v-icon
color="disabled"
>
mdi-dots-vertical
</v-icon>
</template>
</v-textarea>
<dougal-context-menu
:value="remarksMenu"
@input="addRemark"
:items="presetRemarks"
absolute
></dougal-context-menu>
</v-col>
</v-row>
<v-row dense>
<v-col>
<v-autocomplete
ref="labels"
v-model="labels"
:items="Object.keys(allowedLabels)"
chips
deletable-chips
multiple
label="Labels"
@input="labelSearch=null; $refs.labels.isMenuActive=false"
:search-input.sync="labelSearch"
>
<template v-slot:selection="data">
<v-chip
v-bind="data.attrs"
:input-value="data.selected"
close
@click="data.select"
@click:close="remove(data.item)"
:color="allowedLabels[data.item].view.colour"
:title="allowedLabels[data.item].view.description"
>{{data.item}}</v-chip>
</template>
<template v-slot:prepend v-if="presetLabels">
<v-icon
title="Select labels"
color="primary"
@click="showLabelsMenu"
>
mdi-dots-vertical
</v-icon>
</template>
<template v-slot:prepend v-else>
<v-icon
color="disabled"
>
mdi-dots-vertical
</v-icon>
</template>
</v-autocomplete>
</v-col>
</v-row>
<v-row dense>
<v-col>
<v-switch label="Change time" v-model="timeInput" :disabled="shotInput"></v-switch>
</v-col>
<v-col>
<v-switch label="Enter shotpoint" v-model="shotInput" :disabled="timeInput"></v-switch>
</v-col>
</v-row>
<v-row dense>
<v-col :style="{visibility: timeInput ? 'visible' : 'hidden'}">
<v-text-field v-model="tsTime" type="time" step="1" label="Time">
</v-text-field>
</v-col>
<v-col :style="{visibility: timeInput ? 'visible' : 'hidden'}">
<v-text-field v-model="tsDate" type="date" label="Date">
</v-text-field>
</v-col>
<v-col :style="{visibility: shotInput ? 'visible' : 'hidden'}">
<v-autocomplete
:items="sequenceList"
v-model="sequence"
label="Sequence"
></v-autocomplete>
</v-col>
<v-col :style="{visibility: shotInput ? 'visible' : 'hidden'}">
<v-text-field v-model="point" type="number" label="Shot">
</v-text-field>
</v-col>
</v-row>
</v-container>
</v-card-text>
<v-card-actions>
<v-spacer></v-spacer>
<v-btn color="blue darken-1" text @click="close">Cancel</v-btn>
<v-btn color="blue darken-1" text @click="save" :disabled="!isValid">Save</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
</template>
<style>
</style>
<script>
import { mapActions } from 'vuex';
import DougalContextMenu from '@/components/context-menu';
import { withParentProps } from '@/lib/utils'
export default {
name: 'DougalEventEditDialog',
components: {
DougalContextMenu
},
props: {
value: Boolean,
allowedLabels: { type: Object, default: () => {} },
sequences: { type: Object, default: null },
defaultTimestamp: { type: [ Date, String, Number, Function ], default: null },
defaultSequence: { type: Number, default: null },
defaultShotpoint: { type: Number, default: null },
eventMode: { type: String, default: "timed" },
presetRemarks: { type: [ Object, Array ], default: null },
presetLabels: { type: [ Object, Array ], default: null }
},
data () {
const tsNow = new Date;
return {
show: false,
tsDate: tsNow.toISOString().substring(0, 10),
tsTime: tsNow.toISOString().substring(11, 19),
sequenceData: null,
sequence: null,
point: null,
remarks: "",
labels: [],
labelSearch: null,
timer: null,
timeInput: false,
shotInput: false,
remarksMenu: false,
menuX: 0,
menuY: 0,
}
},
computed: {
eventType () {
return this.timeInput
? "timed"
: this.shotInput
? "seq"
: this.eventMode;
},
formTitle () {
if (this.eventType == "seq") {
return `New event at shotpoint ${this.shot.point}`;
} else {
return "New event at time "+this.tstamp.toISOString().replace(/(.{10})T(.{8}).{4}Z$/, "$1 $2");
}
},
defaultTimestampAsDate () {
if (this.defaultTimestamp instanceof Date) {
return this.defaultTimestamp;
} else if (typeof this.defaultTimestamp == 'string') {
return new Date(this.defaultTimestamp);
} else if (typeof this.defaultTimestamp == 'number') {
return new Date(this.defaultTimestamp);
} else if (typeof this.defaultTimestamp == 'function') {
return new Date(this.defaultTimestamp());
}
},
tstamp () {
return this.timeInput
? new Date(this.tsDate+"T"+this.tsTime+"Z")
: this.defaultTimestampAsDate || new Date();
},
shot () {
return this.shotInput
? { sequence: this.sequence, point: Number(this.point) }
: { sequence: this.defaultSequence, point: this.defaultShotpoint };
},
isTimedEvent () {
return Boolean((this.timeInput && this.tstamp) ||
(this.defaultTimestampAsDate && !this.shotInput));
},
isShotEvent () {
return Boolean((this.shotInput && this.shot.sequence && this.shot.point) ||
(this.defaultSequence && this.defaultShotpoint && !this.timeInput));
},
isValid () {
if (this.isTimedEvent) {
return !isNaN(this.tstamp) &&
((this.remarks && this.remarks.trim()) || this.labels.length);
}
if (this.isShotEvent) {
return Number(this.sequence) && Number(this.point) &&
((this.remarks && this.remarks.trim()) || this.labels.length);
}
return false;
},
sequenceList () {
const seq = this.sequences || this.sequenceData || [];
return seq.map(s => s.sequence).sort((a,b) => b-a);
},
eventData () {
if (!this.isValid) {
return null;
}
const data = {}
data.remarks = this.remarks.trim();
if (this.labels) {
data.labels = this.labels;
}
if (this.isTimedEvent) {
data.tstamp = this.tstamp;
} else if (this.isShotEvent) {
data.sequence = this.shot.sequence;
data.point = this.shot.point;
}
return data;
}
},
watch: {
async show (value) {
this.$emit('input', value);
if (value) {
this.updateTimeFields();
await this.updateSequences();
this.sequence = this.defaultSequence;
this.point = this.defaultShotpoint;
this.shotInput = this.eventMode == "seq";
}
},
value (v) {
if (v != this.show) {
this.show = v;
}
}
},
methods: {
clear () {
this.timeInput = false;
this.shotInput = false;
this.remarks = "";
this.labels = [];
},
close () {
this.show = false;
this.clear();
},
save () {
this.$emit('save', this.eventData);
this.close();
},
remove (item) {
this.labels.splice(this.labels.indexOf(item), 1);
},
updateTimeFields () {
const tsNow = new Date;
this.tsDate = tsNow.toISOString().substring(0, 10);
this.tsTime = tsNow.toISOString().substring(11, 19);
},
async updateSequences () {
if (this.sequences == null) {
const url = `/project/${this.$route.params.project}/sequence`;
this.sequenceData = await this.api([url]) || null
}
this.sequence = this.sequenceList.reduce( (a, b) => Math.max(a, b) );
},
showRemarksMenu (e) {
this.remarksMenu = e;
},
addRemark (item) {
const p = withParentProps(item, this.presetRemarks, "items", "labels");
item = p[1]
? Object.assign({labels: p[1]}, item)
: item;
if (item.text) {
if (this.remarks === null) {
this.remarks = "";
}
if (this.remarks.length && this.remarks[this.remarks.length-1] != "\n") {
this.remarks += "\n";
}
this.remarks += item.text;
}
if (item.labels) {
const unique = new Set();
this.labels.concat(item.labels).forEach(l => unique.add(l));
this.labels = [...unique];
}
},
handleKeys (e) {
if (e.ctrlKey && !e.altKey && !e.shiftKey && !e.metaKey && e.keyCode == 13) {
// Ctrl+Enter
if (this.isValid) {
this.save();
}
}
},
...mapActions(["api"])
}
};
</script>

View File

@@ -0,0 +1,240 @@
<template>
<v-dialog
v-model="dialog"
style="z-index:2020;"
>
<template v-slot:activator="{ on, attrs }">
<v-btn
class="hover"
icon
small
title="This entry has edits. Click to view history."
:disabled="disabled"
v-on="on"
>
<v-icon small>mdi-playlist-edit</v-icon>
</v-btn>
</template>
<v-card>
<v-card-title class="headline">
Event history
</v-card-title>
<v-card-text>
<p>Event ID: {{ id }}</p>
<v-data-table
dense
class="small"
:headers="headers"
:items="rows"
item-key="uid"
sort-by="uid"
:sort-desc="true"
:loading="loading"
fixed-header
:footer-props='{itemsPerPageOptions: [ 10, 25, 50, 100, 500, -1 ]}'
>
<template v-slot:item.tstamp="{value}">
<span style="white-space:nowrap;" v-if="value">
{{ value.replace(/(.{10})T(.{8}).{4}Z$/, "$1 $2") }}
</span>
</template>
<template v-slot:item.remarks="{item}">
<template>
<div>
<span v-if="item.labels.length">
<v-chip v-for="label in item.labels"
class="mr-1 px-2 underline-on-hover"
x-small
:color="labels[label] && labels[label].view.colour"
:title="labels[label] && labels[label].view.description"
:key="label"
:href="$route.path+'?label='+encodeURIComponent(label)"
>{{label}}</v-chip>
</span>
<span v-html="$options.filters.markdownInline(item.remarks)">
</span>
</div>
</template>
</template>
<template v-slot:item.valid_from="{item}">
<span style="white-space:nowrap;" v-if="item.validity[1]">
{{ item.validity[1].replace(/(.{10})[T ](.{8}).{4,}(Z|[+-][\d]+)$/, "$1 $2") }}
</span>
<span v-else>
</span>
</template>
<template v-slot:item.valid_until="{item}">
<span style="white-space:nowrap;" v-if="item.validity[2]">
{{ item.validity[2].replace(/(.{10})[T ](.{8}).{4,}(Z|[+-][\d]+)$/, "$1 $2") }}
</span>
<span v-else>
</span>
</template>
<!-- Actions column -->
<template v-slot:item.actions="{ item }">
<div style="white-space:nowrap;">
<!-- NOTE Kind of cheating here by assuming that there will be
no items with *future* validity. -->
<template v-if="item.validity[2]">
<v-btn v-if="!item.meta.readonly"
class="hover"
icon
small
title="Restore"
:disabled="loading"
@click=restoreEvent(item)
>
<v-icon small>mdi-history</v-icon>
</v-btn>
<v-btn v-else
class="hover off"
icon
small
title="This event is read-only"
:disabled="loading"
>
<v-icon small>mdi-lock-reset</v-icon>
</v-btn>
</template>
</div>
</template>
</v-data-table>
</v-card-text>
</v-card>
</v-dialog>
</template>
<style scoped>
.hover {
opacity: 0.4;
}
.hover:hover {
opacity: 1;
}
.hover.off:hover {
opacity: 0.4;
}
.small >>> td, .small >>> th {
font-size: 85% !important;
}
</style>
<script>
import { mapActions, mapGetters } from 'vuex';
export default {
name: 'DougalEventEditHistory',
props: {
id: { type: Number },
disabled: { type: Boolean, default: false },
labels: { default: {} }
},
data () {
return {
dialog: false,
rows: [],
headers: [
{
value: "tstamp",
text: "Timestamp",
width: "20ex"
},
{
value: "sequence",
text: "Sequence",
align: "end",
width: "10ex"
},
{
value: "point",
text: "Shotpoint",
align: "end",
width: "10ex"
},
{
value: "remarks",
text: "Text",
width: "100%"
},
{
value: "valid_from",
text: "Valid From"
},
{
value: "valid_until",
text: "Valid Until"
},
{
value: "actions",
text: "Actions",
sortable: false
}
]
};
},
computed: {
...mapGetters(['loading', 'serverEvent'])
},
watch: {
dialog (val) {
if (!val) {
this.rows = [];
} else {
this.getEventHistory();
}
},
async serverEvent (event) {
if (event.channel == "event" &&
(event.payload?.new?.id ?? event.payload?.old?.id) == this.id) {
// The event that we're viewing has been refreshed (possibly by us)
this.getEventHistory();
}
}
},
methods: {
async getEventHistory () {
const url = `/project/${this.$route.params.project}/event/${this.id}`;
this.rows = (await this.api([url]) || []).map(row => {
row.valid_from = row.validity[1] ?? -Infinity;
row.valid_until = row.validity[2] ?? +Infinity;
return row;
});
},
async restoreEvent (item) {
if (item.id) {
const url = `/project/${this.$route.params.project}/event/${item.id}`;
await this.api([url, {
method: "PUT",
body: item // NOTE Sending extra attributes in the body may cause trouble down the line
}]);
}
},
...mapActions(["api"])
}
};
</script>

View File

@@ -0,0 +1,208 @@
<template>
<v-dialog
:value="value"
@input="(e) => $emit('input', e)"
max-width="600"
>
<v-card>
<v-toolbar
flat
color="transparent"
>
<v-toolbar-title>Event labels</v-toolbar-title>
<v-spacer></v-spacer>
<v-btn
icon
@click="$refs.search.focus()"
>
<v-icon>mdi-magnify</v-icon>
</v-btn>
</v-toolbar>
<v-container class="py-0">
<v-row
align="center"
justify="start"
>
<v-col
v-for="(item, i) in selection"
:key="item.text"
class="shrink"
>
<v-chip
:disabled="loading"
small
:color="item.colour"
:title="item.title"
close
@click:close="selection.splice(i, 1)"
>
<v-icon
left
v-text="item.icon"
></v-icon>
{{ item.text }}
</v-chip>
</v-col>
<v-col v-if="!allSelected"
cols="12"
>
<v-text-field
ref="search"
v-model="search"
full-width
hide-details
label="Search"
single-line
></v-text-field>
</v-col>
</v-row>
</v-container>
<v-divider v-if="!allSelected"></v-divider>
<v-list dense style="max-height:600px;overflow-y:auto;">
<template v-for="item in categories">
<v-list-item v-if="!selection.find(i => i.text == item.text)"
dense
:key="item.text"
:disabled="loading"
@click="selection.push(item)"
>
<v-list-item-avatar
class="my-0"
width="12ex"
>
<v-chip
x-small
:color="item.colour"
:title="item.title"
>{{item.text}}</v-chip>
</v-list-item-avatar>
<v-list-item-title v-text="item.title"></v-list-item-title>
</v-list-item>
</template>
</v-list>
<v-divider></v-divider>
<v-card-actions>
<v-btn
:loading="loading"
color="warning"
text
@click="close"
>
Cancel
</v-btn>
<v-spacer></v-spacer>
<v-btn
:disabled="!dirty"
:loading="loading"
color="primary"
text
@click="save"
>
Save
</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
</template>
<script>
function stringSort (a, b) {
return a == b
? 0
: a < b
? -1
: +1;
}
export default {
name: 'DougalEventEditLabels',
props: {
value: { default: false },
labels: { type: Object },
selected: {type: Array },
loading: { type: Boolean, default: false }
},
data: () => ({
dialog: false,
search: '',
selection: [],
}),
computed: {
allSelected () {
return this.selection.length === this.items.length
},
dirty () {
// True when the selection differs from the currently selected labels
return !this.selection.every(i => this.selected.includes(i.text)) ||
!this.selected.every(i => this.selection.find(j => j.text==i));
},
categories () {
const search = this.search.toLowerCase()
if (!search) return this.items
return this.items.filter(item => {
const text = item.text.toLowerCase();
const title = item.title.toLowerCase();
return text.includes(search) || title.includes(search);
}).sort( (a, b) => stringSort(a.text, b.text) )
},
items () {
return Object.keys(this.labels).map(this.labelToItem);
}
},
watch: {
value () {
this.dialog = this.value;
if (this.dialog) {
this.$nextTick(() => this.$refs.search?.focus());
}
},
selected () {
this.selection = this.selected.map(this.labelToItem)
},
selection () {
this.search = '';
this.$refs.search?.focus();
},
},
methods: {
labelToItem (k) {
return {
text: k,
icon: this.labels?.[k]?.view?.icon,
colour: this.labels?.[k]?.view?.colour,
title: this.labels?.[k]?.view?.description
};
},
close () {
this.selection = this.selected.map(this.labelToItem)
this.$emit("input", false);
},
save () {
this.$emit("selectionChanged", {labels: this.selection.map(i => i.text)});
this.$emit("input", false);
},
},
}
</script>

View File

@@ -0,0 +1,697 @@
<template>
<v-dialog
:value="value"
@input="(e) => $emit('input', e)"
max-width="600"
>
<template v-slot:activator="{ on, attrs }">
<v-btn
class="mx-2"
fab dark
x-small
color="primary"
title="Add event"
@click="(e) => $emit('new', e)"
v-bind="attrs"
v-on="on"
>
<v-icon dark>mdi-plus</v-icon>
</v-btn>
</template>
<v-card>
<v-toolbar
flat
color="transparent"
>
<v-toolbar-title>Event</v-toolbar-title>
<v-spacer></v-spacer>
</v-toolbar>
<v-container class="py-0">
<v-row dense>
<v-col>
<v-menu
v-model="dateMenu"
:close-on-content-click="false"
:nudge-right="40"
transition="scale-transition"
offset-y
min-width="auto"
>
<template v-slot:activator="{ on, attrs }">
<v-text-field
v-model="tsDate"
:disabled="!!(entrySequence || entryPoint)"
label="Date"
suffix="UTC"
prepend-icon="mdi-calendar"
readonly
v-bind="attrs"
v-on="on"
@change="updateAncillaryData"
></v-text-field>
</template>
<v-date-picker
v-model="tsDate"
@input="dateMenu = false"
></v-date-picker>
</v-menu>
</v-col>
<v-col>
<v-text-field
v-model="tsTime"
:disabled="!!(entrySequence || entryPoint)"
label="Time"
suffix="UTC"
prepend-icon="mdi-clock-outline"
type="time"
step="1"
@change="updateAncillaryData"
>
<template v-slot:prepend>
<v-menu
v-model="timeMenu"
:close-on-content-click="false"
:nudge-right="40"
transition="scale-transition"
offset-y
min-width="auto"
>
<template v-slot:activator="{ on, attrs }">
<v-icon v-on="on" v-bind="attrs">mdi-clock-outline</v-icon>
</template>
<v-time-picker
v-model="tsTime"
format="24hr"
></v-time-picker>
</v-menu>
</template>
</v-text-field>
</v-col>
</v-row>
<v-row dense>
<v-col>
<v-text-field
v-model="entrySequence"
type="number"
min="1"
step="1"
label="Sequence"
prepend-icon="mdi-format-list-bulleted"
@change="updateAncillaryData"
>
</v-text-field>
</v-col>
<v-col>
<v-text-field
v-model="entryPoint"
type="number"
min="1"
step="1"
label="Point"
prepend-icon="mdi-map-marker-circle"
@change="updateAncillaryData"
>
</v-text-field>
</v-col>
</v-row>
<v-row dense>
<v-col cols="12">
<v-combobox
ref="remarks"
v-model="entryRemarks"
:disabled="loading"
:search-input.sync="entryRemarksInput"
:items="remarksAvailable"
:filter="searchRemarks"
item-text="text"
return-object
label="Remarks"
hint="Placeholders: @DMS@, @DEG@, @EN@, @WD@, @BSP@, @CMG@, …"
prepend-icon="mdi-text-box-outline"
append-outer-icon="mdi-magnify"
@click:append-outer="(e) => remarksMenu = e"
></v-combobox>
<dougal-context-menu
:value="remarksMenu"
@input="handleRemarksMenu"
:items="presetRemarks"
absolute
></dougal-context-menu>
</v-col>
</v-row>
<v-row dense>
<v-col cols="12">
<v-autocomplete
ref="labels"
v-model="entryLabels"
:items="categories"
multiple
menu-props="closeOnClick, closeOnContentClick"
attach
chips
label="Labels"
prepend-icon="mdi-tag-multiple"
append-outer-icon="mdi-magnify"
@click:append-outer="() => $refs.labels.focus()"
>
<template v-slot:selection="{ item, index, select, selected, disabled }">
<v-chip
:disabled="loading"
small
light
:color="item.colour"
:title="item.title"
close
@click:close="entryLabels.splice(index, 1)"
>
<v-icon
left
v-text="item.icon"
></v-icon>
{{ item.text }}
</v-chip>
</template>
<template v-slot:item="{ item }">
<v-list-item-avatar
class="my-0"
width="12ex"
>
<v-chip
x-small
light
:color="item.colour"
:title="item.title"
>{{item.text}}</v-chip>
</v-list-item-avatar>
<v-list-item-title v-text="item.title"></v-list-item-title>
</template>
</v-autocomplete>
</v-col>
</v-row>
<v-row dense>
<v-col>
<v-text-field
v-model="entryLatitude"
label="Latitude"
prepend-icon="φ"
disabled
>
<template v-slot:append-outer>
<v-icon v-if="false/*TODO*/"
title="Click to set position"
@click="1==1/*TODO*/"
>mdi-crosshairs-gps</v-icon>
<v-icon v-else
disabled
title="No GNSS available"
>mdi-crosshairs</v-icon>
</template>
</v-text-field>
</v-col>
<v-col>
<v-text-field
v-model="entryLongitude"
label="Longitude"
prepend-icon="λ"
disabled
>
<template v-slot:append-outer>
<v-icon v-if="false"
title="Click to set position"
@click="getPosition"
>mdi-crosshairs-gps</v-icon>
<v-icon v-else
title="No GNSS available"
disabled
>mdi-crosshairs</v-icon>
</template>
</v-text-field>
</v-col>
</v-row>
</v-container>
<v-divider></v-divider>
<v-card-actions>
<v-btn
color="warning"
text
@click="close"
>
Cancel
</v-btn>
<v-btn v-if="!id && (entrySequence || entryPoint)"
color="info"
text
title="Enter an event by time"
@click="timed"
>
<v-icon left small>mdi-clock-outline</v-icon>
Timed
</v-btn>
<v-spacer></v-spacer>
<v-btn
:disabled="!canSave"
:loading="loading"
color="primary"
text
@click="save"
>
Save
</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
</template>
<style>
/* https://github.com/vuetifyjs/vuetify/issues/471 */
.v-dialog {
overflow-y: initial;
}
</style>
<script>
import { mapActions } from 'vuex';
import DougalContextMenu from '@/components/context-menu';
function stringSort (a, b) {
return a == b
? 0
: a < b
? -1
: +1;
}
function flattenRemarks(items, keywords=[], labels=[]) {
const result = [];
if (items) {
for (const item of items) {
if (!item.items) {
result.push({
text: item.text,
labels: labels.concat(item.labels??[]),
keywords
})
} else {
const k = [...keywords, item.text];
const l = [...labels, ...(item.labels??[])];
result.push(...flattenRemarks(item.items, k, l))
}
}
}
return result;
}
/** Compare two arrays
*
* @param a First array
* @param b Second array
* @param cbB Callback to transform elements of `b`
*
* @return true if the sets are distinct, false otherwise
*
* Note that this will not work with object or other complex
* elements unless the array members are the same object (as
* opposed to merely identical).
*/
function distinctSets(a, b, cbB = (i) => i) {
return !a.every(i => b.map(cbB).includes(i)) ||
!b.map(cbB).every(i => a.find(j => j==i));
}
export default {
name: 'DougalEventEdit',
components: {
DougalContextMenu
},
props: {
value: { default: false },
availableLabels: { type: Object, default: () => ({}) },
presetRemarks: { type: Array, default: () => [] },
id: { type: Number },
tstamp: { type: String },
sequence: { type: Number },
point: { type: Number },
remarks: { type: String },
labels: { type: Array, default: () => [] },
latitude: { type: Number },
longitude: { type: Number },
loading: { type: Boolean, default: false }
},
data: () => ({
dateMenu: false,
timeMenu: false,
remarksMenu: false,
search: '',
entryLabels: [],
tsDate: null,
tsTime: null,
entrySequence: null,
entryPoint: null,
entryRemarks: null,
entryRemarksInput: null,
entryLatitude: null,
entryLongitude: null
}),
computed: {
remarksAvailable () {
return this.entryRemarksInput == this.entryRemarks?.text ||
this.entryRemarksInput == this.entryRemarks
? []
: flattenRemarks(this.presetRemarks);
},
allSelected () {
return this.entryLabels.length === this.items.length
},
dirty () {
// Selected remark distinct from input remark
if (this.entryRemarksText != this.remarks) {
return true;
}
// The user is editing the remarks
if (this.entryRemarksText != this.entryRemarksInput) {
return true;
}
// Selected label set distinct from input labels
if (distinctSets(this.selectedLabels, this.entryLabels, (i) => i.text)) {
return true;
}
// Selected seqpoint distinct from input seqpoint (if seqpoint present)
if ((this.entrySequence || this.entryPoint)) {
if (this.entrySequence != this.sequence || this.entryPoint != this.point) {
return true;
}
} else {
// Selected timestamp distinct from input timestamp (if no seqpoint)
const epoch = Date.parse(this.tstamp);
const entryEpoch = Date.parse(`${this.tsDate} ${this.tsTime}Z`);
// Ignore difference of less than one second
if (Math.abs(entryEpoch - epoch) > 1000) {
return true;
}
}
return false;
},
canSave () {
// There is either tstamp or seqpoint, latter wins
if (!(this.entrySequence && this.entryPoint) && !this.entryTstamp) {
return false;
}
// There are remarks and/or labels
if (!this.entryRemarksText && !this.entryLabels.length) {
return false;
}
// Form is dirty
if (!this.dirty) {
return false;
}
return true;
},
categories () {
const search = this.search.toLowerCase()
if (!search) return this.items
return this.items.filter(item => {
const text = item.text.toLowerCase();
const title = item.title.toLowerCase();
return text.includes(search) || title.includes(search);
}).sort( (a, b) => stringSort(a.text, b.text) )
},
items () {
return Object.keys(this.availableLabels).map(this.labelToItem);
},
selectedLabels () {
return this.event?.labels ?? [];
},
entryTstamp () {
const ts = new Date(Date.parse(`${this.tsDate} ${this.tsTime}Z`));
if (isNaN(ts)) {
return null;
}
return ts.toISOString();
},
entryRemarksText () {
return typeof this.entryRemarks === 'string'
? this.entryRemarks
: this.entryRemarks?.text;
}
},
watch: {
value () {
if (this.value) {
// Populate fields from properties
if (!this.tstamp && !this.sequence && !this.point) {
const ts = (new Date()).toISOString();
this.tsDate = ts.substr(0, 10);
this.tsTime = ts.substr(11, 8);
} else if (this.tstamp) {
this.tsDate = this.tstamp.substr(0, 10);
this.tsTime = this.tstamp.substr(11, 8);
}
// NOTE Dead code
if (this.meta?.geometry?.type == "Point") {
this.entryLongitude = this.meta.geometry.coordinates[0];
this.entryLatitude = this.meta.geometry.coordinates[1];
}
this.entryLatitude = this.latitude;
this.entryLongitude = this.longitude;
this.entrySequence = this.sequence;
this.entryPoint = this.point;
this.entryRemarks = this.remarks;
this.entryLabels = [...(this.labels??[])];
// Focus remarks field
this.$nextTick(() => this.$refs.remarks.focus());
}
},
tstamp () {
if (this.tstamp) {
this.tsDate = this.tstamp.substr(0, 10);
this.tsTime = this.tstamp.substr(11, 8);
} else if (this.sequence || this.point) {
this.tsDate = null;
this.tsTime = null;
} else {
const ts = (new Date()).toISOString();
this.tsDate = ts.substr(0, 10);
this.tsTime = ts.substr(11, 8);
}
},
sequence () {
if (this.sequence && !this.tstamp) {
this.tsDate = null;
this.tsTime = null;
}
},
point () {
if (this.point && !this.tstamp) {
this.tsDate = null;
this.tsTime = null;
}
},
entryTstamp (n, o) {
//this.updateAncillaryData();
},
entrySequence (n, o) {
//this.updateAncillaryData();
},
entryPoint (n, o) {
//this.updateAncillaryData();
},
entryRemarks () {
if (this.entryRemarks?.labels) {
this.entryLabels = [...this.entryRemarks.labels];
} else if (!this.entryRemarks) {
this.entryLabels = [];
}
},
selectedLabels () {
this.entryLabels = this.selectedLabels.map(this.labelToItem)
},
entryLabels () {
this.search = '';
},
},
methods: {
labelToItem (k) {
return {
text: k,
icon: this.availableLabels[k].view?.icon,
colour: this.availableLabels[k].view?.colour,
title: this.availableLabels[k].view?.description
};
},
searchRemarks (item, queryText, itemText) {
const needle = queryText.toLowerCase();
const text = item.text.toLowerCase();
const keywords = item.keywords.map(i => i.toLowerCase());
const labels = item.labels.map(i => i.toLowerCase());
return text.includes(needle) ||
keywords.some(i => i.includes(needle)) ||
labels.some(i => i.includes(needle));
},
handleRemarksMenu (event) {
if (typeof event == 'boolean') {
this.remarksMenu = event;
} else {
this.entryRemarks = event;
this.remarksMenu = false;
}
},
async getPointData () {
const url = `/project/${this.$route.params.project}/sequence/${this.entrySequence}/${this.entryPoint}`;
return await this.api([url]);
},
async getTstampData () {
const url = `/navdata?q=tstamp:${this.entryTstamp}&tolerance=2500`;
return await this.api([url]);
},
async updateAncillaryData () {
if (this.entrySequence && this.entryPoint) {
// Fetch data for this sequence / point
const data = await this.getPointData();
if (data?.tstamp) {
this.tsDate = data.tstamp.substr(0, 10);
this.tsTime = data.tstamp.substr(11, 8);
}
if (data?.geometry) {
this.entryLongitude = (data?.geometry?.coordinates??[])[0];
this.entryLatitude = (data?.geometry?.coordinates??[])[1];
}
} else if (!this.entrySequence && !this.entryPoint && this.entryTstamp) {
// Fetch data for this timestamp
const data = ((await this.getTstampData())??[])[0];
console.log("TS DATA", data);
if (data?._sequence && data?.shot) {
this.entrySequence = Number(data._sequence);
this.entryPoint = data.shot;
}
if (data?.tstamp) {
this.tsDate = data.tstamp.substr(0, 10);
this.tsTime = data.tstamp.substr(11, 8);
}
if (data?.longitude && data?.latitude) {
this.entryLongitude = data.longitude;
this.entryLatitude = data.latitude;
}
}
},
timed () {
const tstamp = (new Date()).toISOString();
this.entrySequence = null;
this.entryPoint = null;
this.tsDate = tstamp.substr(0, 10);
this.tsTime = tstamp.substr(11, 8);
},
close () {
this.entryLabels = this.selectedLabels.map(this.labelToItem)
this.$emit("input", false);
},
save () {
// In case the focus goes directly from the remarks field
// to the Save button.
if (this.entryRemarksInput != this.entryRemarksText) {
this.entryRemarks = this.entryRemarksInput;
}
const data = {
id: this.id,
remarks: this.entryRemarksText,
labels: this.entryLabels
};
/* NOTE This is the purist way.
* Where we expect that the server will match
* timestamps with shotpoints and so on
*
if (this.entrySequence && this.entryPoint) {
data.sequence = this.entrySequence;
data.point = this.entryPoint;
} else {
data.tstamp = this.entryTstamp;
}
*/
/* NOTE And this is the pragmatic way.
*/
data.tstamp = this.entryTstamp;
if (this.entrySequence && this.entryPoint) {
data.sequence = this.entrySequence;
data.point = this.entryPoint;
}
this.$emit("changed", data);
this.$emit("input", false);
},
...mapActions(["api"])
},
}
</script>


@@ -11,7 +11,7 @@
<v-icon v-if="serverConnected" class="mr-6" small title="Connected to server">mdi-lan-connect</v-icon>
<v-icon v-else class="mr-6" small color="red" title="Server connection lost (we'll reconnect automatically when the server comes back)">mdi-lan-disconnect</v-icon>
<dougal-notifications-control class="mr-6"></dougal-notifications-control>
<div title="Night mode">
@@ -31,7 +31,7 @@
font-family: "Bank Gothic Medium";
src: local("Bank Gothic Medium"), url("/fonts/bank-gothic-medium.woff");
}
.brand {
font-family: "Bank Gothic Medium";
}
@@ -56,7 +56,7 @@ export default {
const date = new Date();
return date.getUTCFullYear();
},
...mapState({serverConnected: state => state.notify.serverConnected})
}
};


@@ -0,0 +1,363 @@
<template>
<v-card style="min-height:400px;">
<v-card-title class="headline">
Array inline / crossline error
<v-spacer></v-spacer>
<v-switch v-model="scatterplot" label="Scatterplot"></v-switch>
<v-switch class="ml-4" v-model="histogram" label="Histogram"></v-switch>
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
<div class="graph-container" ref="graph0"></div>
</v-col>
</v-row>
<v-row v-show="scatterplot">
<v-col>
<div class="graph-container" ref="graph1"></div>
</v-col>
</v-row>
<v-row v-show="histogram">
<v-col>
<div class="graph-container" ref="graph2"></div>
</v-col>
</v-row>
</v-container>
<v-overlay :value="busy" absolute z-index="1">
<v-progress-circular indeterminate></v-progress-circular>
</v-overlay>
</v-card>
</template>
<style scoped>
.graph-container {
width: 100%;
height: 100%;
}
</style>
<script>
import Plotly from 'plotly.js-dist';
import { mapActions, mapGetters } from 'vuex';
import unpack from '@/lib/unpack.js';
export default {
name: 'DougalGraphArraysIJScatter',
props: [ "data", "settings" ],
data () {
return {
graph: [],
busy: false,
resizeObserver: null,
scatterplot: false,
histogram: false
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
if (newVal === null) {
this.busy = true;
} else {
this.busy = false;
this.plot();
}
},
settings () {
for (const key in this.settings) {
this[key] = this.settings[key];
}
},
histogram () {
this.plot();
this.$emit("update:settings", {[`${this.$options.name}.histogram`]: this.histogram});
},
scatterplot () {
this.plot();
this.$emit("update:settings", {[`${this.$options.name}.scatterplot`]: this.scatterplot});
}
},
methods: {
plot () {
this.plotSeries();
if (this.histogram) {
this.plotHistogram();
}
if (this.scatterplot) {
this.plotScatter();
}
},
plotSeries () {
if (!this.data) {
return;
}
function transform (d, idx=0, otherParams={}) {
const errortype = d.errorfinal ? "errorfinal" : "errorraw";
const coords = unpack(unpack(d, errortype), "coordinates");
const x = unpack(d, "point");
const y = unpack(coords, idx);
const data = {
type: "scatter",
mode: "lines",
x,
y,
transforms: [{
type: "groupby",
groups: unpack(unpack(d, "meta"), "src_number"),
styles: [
{target: 1, value: {line: {color: "green"}}},
{target: 2, value: {line: {color: "red"}}}
]
}],
...otherParams
};
return data;
}
const data = [
transform(this.data.items, 1, {
xaxis: 'x',
yaxis: 'y',
name: 'Crossline'
}),
transform(this.data.items, 0, {
xaxis: 'x',
yaxis: 'y2',
name: 'Inline'
})
];
this.busy = false;
const layout = {
//autosize: true,
title: {text: "Inline / crossline error sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
yaxis2: {
title: "Crossline (m)",
anchor: "y2",
domain: [ 0.55, 1 ]
},
yaxis: {
title: "Inline (m)",
anchor: "y1",
domain: [ 0, 0.45 ]
},
xaxis: {
title: "Shotpoint",
anchor: "x1"
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph[0] = Plotly.newPlot(this.$refs.graph0, data, layout, config);
},
plotScatter () {
console.log("plot");
if (!this.data) {
console.log("missing data");
return;
}
console.log("Will plot sequence", this.data.meta.project, this.data.meta.sequence);
function transform (d) {
const errortype = d.errorfinal ? "errorfinal" : "errorraw";
const coords = unpack(unpack(d, errortype), "coordinates");
const x = unpack(coords, 0);
const y = unpack(coords, 1);
const data = [{
type: "scatter",
mode: "markers",
x,
y,
transforms: [{
type: "groupby",
groups: unpack(unpack(d, "meta"), "src_number"),
styles: [
{target: 1, value: {line: {color: "green"}}},
{target: 2, value: {line: {color: "red"}}}
]
}]
}];
return data;
}
const data = transform(this.data.items);
this.busy = false;
const layout = {
//autosize: true,
//title: {text: "Inline / crossline error sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
yaxis: {
title: "Inline (m)",
//zeroline: false
},
xaxis: {
title: "Crossline (m)"
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph[1] = Plotly.newPlot(this.$refs.graph1, data, layout, config);
},
plotHistogram () {
if (!this.data) {
console.log("missing data");
return;
}
function transform (d, idx=0, otherParams={}) {
const errortype = d.errorfinal ? "errorfinal" : "errorraw";
const coords = unpack(unpack(d, errortype), "coordinates");
const x = unpack(coords, idx);
const data = {
type: "histogram",
histnorm: 'probability',
x,
transforms: [{
type: "groupby",
groups: unpack(unpack(d, "meta"), "src_number"),
styles: [
{target: 1, value: {marker: {color: "rgba(129, 199, 132, 0.9)"}}},
{target: 2, value: {marker: {color: "rgba(229, 115, 115, 0.9)"}}}
]
}],
...otherParams
};
return data;
}
const data = [
transform(this.data.items, 0, {
xaxis: 'x',
yaxis: 'y',
name: 'Crossline'
}),
transform(this.data.items, 1, {
xaxis: 'x2',
yaxis: 'y',
name: 'Inline'
})
];
const layout = {
//autosize: true,
//title: {text: "Inline / crossline error sequence %{meta.sequence}"},
legend: {
title: { text: "Array" }
},
xaxis: {
title: "Crossline distance (m)",
domain: [ 0, 0.45 ],
anchor: 'x1'
},
yaxis: {
title: "Frequency (01)",
domain: [ 0, 1 ],
anchor: 'y1'
},
xaxis2: {
title: "Inline distance (m)",
domain: [ 0.55, 1 ],
anchor: 'x2'
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.busy = false;
console.log(data);
console.log(layout);
this.graph[2] = Plotly.newPlot(this.$refs.graph2, data, layout, config);
},
replot () {
if (!this.graph.length) {
return;
}
console.log("Replotting");
this.graph.forEach( (graph, idx) => {
const ref = this.$refs["graph"+idx];
Plotly.relayout(ref, {
width: ref.clientWidth,
height: ref.clientHeight
});
});
},
},
async mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graph0);
this.resizeObserver.observe(this.$refs.graph1);
this.resizeObserver.observe(this.$refs.graph2);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graph2);
this.resizeObserver.unobserve(this.$refs.graph1);
this.resizeObserver.unobserve(this.$refs.graph0);
}
}
};
</script>


@@ -0,0 +1,364 @@
<template>
<v-card style="min-height:400px;">
<v-card-title class="headline">
Gun depth
<v-spacer></v-spacer>
<v-switch v-model="shotpoint" label="Shotpoint"></v-switch>
<v-switch class="ml-4" v-model="violinplot" label="Violin plot"></v-switch>
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
<div class="graph-container" ref="graphSeries"></div>
</v-col>
</v-row>
<v-row v-show="shotpoint">
<v-col>
<div class="graph-container" ref="graphBar"></div>
</v-col>
</v-row>
<v-row v-show="violinplot">
<v-col>
<div class="graph-container" ref="graphViolin"></div>
</v-col>
</v-row>
</v-container>
<v-overlay :value="busy" absolute z-index="1">
<v-progress-circular indeterminate></v-progress-circular>
</v-overlay>
</v-card>
</template>
<style scoped>
.graph-container {
width: 100%;
height: 100%;
}
</style>
<script>
import * as d3a from 'd3-array';
import Plotly from 'plotly.js-dist';
import { mapActions, mapGetters } from 'vuex';
import unpack from '@/lib/unpack.js';
import * as aes from '@/lib/graphs/aesthetics.js';
export default {
name: 'DougalGraphGunsDepth',
props: [ "data", "settings" ],
data () {
return {
graph: null,
graphHover: null,
busy: false,
resizeObserver: null,
shotpoint: true,
violinplot: false
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
console.log("data changed");
if (newVal === null) {
this.busy = true;
} else {
this.busy = false;
this.plot();
}
},
settings () {
for (const key in this.settings) {
this[key] = this.settings[key];
}
},
shotpoint () {
if (this.shotpoint) {
this.replot();
}
this.$emit("update:settings", {[`${this.$options.name}.shotpoint`]: this.shotpoint});
},
violinplot () {
if (this.violinplot) {
this.plotViolin();
}
this.$emit("update:settings", {[`${this.$options.name}.violinplot`]: this.violinplot});
}
},
methods: {
plot () {
this.plotSeries();
if (this.violinplot) {
this.plotViolin();
}
},
async plotSeries () {
function transformSeries (d, src_number, otherParams={}) {
const meta = src_number
? unpack(d, "meta").filter( s => s.src_number == src_number )
: unpack(d, "meta");
const guns = unpack(meta, "guns").map(s => s.filter(g => g[2] == src_number));
const gunDepths = guns.map(s => s.map(g => g[10]));
const gunDepthsSorted = gunDepths.map(s => d3a.sort(s));
const gunsAvgDepth = gunDepths.map( (s, sidx) => d3a.mean(s) );
const x = src_number
? unpack(d.filter(s => s.meta.src_number == src_number), "point")
: unpack(d, "point");
const tracesGunDepths = [{
type: "scatter",
mode: "lines",
x,
y: gunDepthsSorted.map(s => d3a.quantileSorted(s, 0.25)),
...aes.gunArrays[src_number || 1].min
},
{
type: "scatter",
mode: "lines",
fill: "tonexty",
x,
y: gunsAvgDepth,
...aes.gunArrays[src_number || 1].avg
},
{
type: "scatter",
mode: "lines",
fill: "tonexty",
x,
y: gunDepthsSorted.map(s => d3a.quantileSorted(s, 0.75)),
...aes.gunArrays[src_number || 1].max
}];
const tracesGunsDepthsIndividual = {
//name: `Array ${src_number} outliers`,
type: "scatter",
mode: "markers",
marker: {size: 2 },
hoverinfo: "skip",
x: gunDepthsSorted.map( (s, idx) =>
s.filter( g => g < d3a.quantileSorted(s, 0.05) || g > d3a.quantileSorted(s, 0.95))
.map( () => x[idx] )
).flat(),
y: gunDepthsSorted.map( (s, idx) =>
s.filter( g => g < d3a.quantileSorted(s, 0.05) || g > d3a.quantileSorted(s, 0.95))
).flat(),
...aes.gunArrays[src_number || 1].out
};
const data = [ ...tracesGunDepths, tracesGunsDepthsIndividual ]
return data;
}
if (!this.data) {
console.log("missing data");
return;
}
const sources = [ ...new Set(unpack(this.data.items, "meta").map( s => s.src_number ))];
const data = sources.map( src_number => transformSeries(this.data.items, src_number) ).flat();
console.log("Sources", sources);
console.log(data);
this.busy = false;
const layout = {
//autosize: true,
title: {text: "Gun depths sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
hovermode: "x",
yaxis: {
title: "Depth (m)",
//zeroline: false
},
xaxis: {
title: "Shotpoint",
showspikes: true
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphSeries, data, layout, config);
this.$refs.graphSeries.on('plotly_hover', (d) => {
const point = d.points[0].x;
const item = this.data.items.find(s => s.point == point);
const guns = item.meta.guns.filter( g => g[2] == item.meta.src_number );
const gunIds = guns.map( g => "G"+g[1] );
const depths = unpack(guns, 10);
const data = [{
type: "bar",
x: gunIds,
y: depths,
transforms: [{
type: "groupby",
groups: unpack(guns, 0)
}],
}];
const layout = {
title: {text: "Gun depths shot %{meta.point}"},
height: 300,
yaxis: {
title: "Depth (m)",
range: [ Math.min(d3a.min(depths)-0.1, 5), Math.max(d3a.max(depths)+0.1, 7) ]
},
xaxis: {
title: "Gun number",
type: 'category'
},
meta: {
point
}
};
const config = { displaylogo: false };
Plotly.react(this.$refs.graphBar, data, layout, config);
});
},
async plotViolin () {
function transformViolin (d, opts = {}) {
const styles = [];
unpack(unpack(d, "meta"), "guns").flat().forEach(i => {
const gunId = i[1];
const arrayId = i[2];
if (!styles[gunId]) {
styles[gunId] = Object.assign({target: gunId}, aes.gunArrayViolins[arrayId]);
}
});
const data = {
type: 'violin',
x: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1), // Gun number
y: unpack(unpack(unpack(d, "meta"), "guns").flat(), 10), // Gun depth
points: 'none',
box: {
visible: true
},
line: {
color: 'green',
},
meanline: {
visible: true
},
transforms: [{
type: 'groupby',
groups: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1),
styles: styles.filter(i => !!i)
}]
}
return data;
}
console.log("plot violin");
if (!this.data) {
console.log("missing data");
return;
}
console.log("Will plot sequence", this.data.meta.project, this.data.meta.sequence);
const data = [ transformViolin(this.data.items) ];
this.busy = false;
const layout = {
//autosize: true,
showlegend: false,
title: {text: "Individual gun depths sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
yaxis: {
title: "Depth (m)",
zeroline: false
},
xaxis: {
title: "Gun number"
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphViolin, data, layout, config);
},
replot () {
if (!this.graph) {
return;
}
console.log("Replotting");
Object.values(this.$refs).forEach( ref => {
if (ref.data) {
console.log("Replotting", ref, ref.clientWidth, ref.clientHeight);
Plotly.relayout(ref, {
width: ref.clientWidth,
height: ref.clientHeight
});
}
});
},
...mapActions(["api"])
},
mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graphSeries);
this.resizeObserver.observe(this.$refs.graphViolin);
this.resizeObserver.observe(this.$refs.graphBar);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graphBar);
this.resizeObserver.unobserve(this.$refs.graphViolin);
this.resizeObserver.unobserve(this.$refs.graphSeries);
}
}
};
</script>


@@ -0,0 +1,405 @@
<template>
<v-card style="min-height:400px;">
<v-card-title class="headline">
Gun details
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
<div class="graph-container" ref="graphHeat"></div>
</v-col>
</v-row>
</v-container>
<v-overlay :value="busy" absolute z-index="1">
<v-progress-circular indeterminate></v-progress-circular>
</v-overlay>
</v-card>
</template>
<style scoped>
.graph-container {
width: 100%;
height: 100%;
}
</style>
<script>
import * as d3a from 'd3-array';
import Plotly from 'plotly.js-dist';
import { mapActions, mapGetters } from 'vuex';
import unpack from '@/lib/unpack.js';
import * as aes from '@/lib/graphs/aesthetics.js';
export default {
name: 'DougalGraphGunsHeatmap',
props: [ "data" ],
data () {
return {
graph: null,
graphHover: null,
busy: false,
resizeObserver: null,
// TODO: aspects should be a prop
aspects: [
"Mode", "Detect", "Autofire", "Aimpoint", "Firetime", "Delay",
"Delta",
"Depth", "Pressure", "Volume", "Filltime"
]
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
console.log("data changed");
if (newVal === null) {
this.busy = true;
} else {
this.busy = false;
this.plot();
}
},
},
methods: {
plot () {
this.plotHeat();
},
async plotHeat () {
if (!this.data) {
console.log("missing data");
return;
}
function transform (data, aspects=["Depth", "Pressure"]) {
const facets = [
// Mode
{
params: {
name: "Mode",
hovertemplate: "SP%{x}<br>%{y}<br>%{text}",
},
text: [ "Off", "Auto", "Manual", "Disabled" ],
conversion: (gun, shot) => {
switch (gun[3]) {
case "A":
return 1;
case "M":
return 2;
case "O":
return 0;
case "D":
return 3;
}
}
},
// Detect
{
params: {
name: "Detect",
hovertemplate: "SP%{x}<br>%{y}<br>%{text}",
},
text: [ "Zero", "Peak", "Level" ],
conversion: (gun, shot) => {
switch (gun[4]) {
case "P":
return 1;
case "Z":
return 0;
case "L":
return 2;
}
}
},
// Autofire
{
params: {
name: "Autofire",
hovertemplate: "SP%{x}<br>%{y}<br>%{text}",
},
text: [ "False", "True" ],
conversion: (gun, shot) => {
return gun[5] ? 1 : 0;
}
},
// Aimpoint
{
params: {
name: "Aimpoint",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms"
},
conversion: (gun, shot) => gun[7]
},
// Firetime
{
params: {
name: "Firetime",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms"
},
conversion: (gun, shot) => gun[2] == shot.meta.src_number ? gun[8] : null
},
// Delta
{
params: {
name: "Delta",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms",
// NOTE: These values are based on
// Grane + Snorre's ±1.5 ms spec. While a fairly
// common range, I still consider these min / max
// numbers to have been chosen semi-arbitrarily.
zmin: -2,
zmax: 2
},
conversion: (gun, shot) => gun[2] == shot.meta.src_number ? gun[7]-gun[8] : null
},
// Delay
{
params: {
name: "Delay",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms"
},
conversion: (gun, shot) => gun[9]
},
// Depth
{
params: {
name: "Depth",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} m"
},
conversion: (gun, shot) => gun[10]
},
// Pressure
{
params: {
name: "Pressure",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} psi"
},
conversion: (gun, shot) => gun[11]
},
// Volume
{
params: {
name: "Volume",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} in³"
},
conversion: (gun, shot) => gun[12]
},
// Filltime
{
params: {
name: "Filltime",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms"
},
// NOTE that filltime is applicable to the *non* firing guns
conversion: (gun, shot) => gun[2] == shot.meta.src_number ? null : gun[13]
}
];
// Get gun numbers
const guns = [...new Set(data.map( s => s.meta.guns.map( g => g[1] ) ).flat())];
// z eventually will have the structure:
// z = {
//   [aspect]: [ // One row per gun
//     [ // Values for gun 0, one per shotpoint ],
//     …more guns…
//   ]
// }
const z = {};
// x is an array of shotpoints
const x = [];
// y is an array of gun numbers
const y = guns.map( gun => `G${gun}` );
// Build array of guns (i.e., populate z)
// We prefer to do this outside the shot-to-shot loop
// for efficiency
for (const facet of facets) {
const label = facet.params.name;
z[label] = Array(guns.length);
for (let i=0; i<guns.length; i++) {
z[label][i] = [];
}
}
// Populate array of guns with shotpoint data
for (let shot of data) {
x.push(shot.point);
for (const facet of facets) {
const label = facet.params.name;
const facetGunsArray = z[label];
for (const gun of shot.meta.guns) {
const gunIndex = gun[1]-1;
const facetGun = facetGunsArray[gunIndex];
facetGun.push(facet.conversion(gun, shot));
}
}
}
return aspects.map( (aspect, idx) => {
const facet = facets.find(el => el.params.name == aspect) || {};
const defaultParams = {
name: aspect,
type: "heatmap",
showscale: false,
x,
y,
z: z[aspect],
text: facet.text ? z[aspect].map(row => row.map(v => facet.text[v])) : undefined,
xaxis: "x",
yaxis: "y" + (idx > 0 ? idx+1 : "")
}
return Object.assign({}, defaultParams, facet.params);
});
}
const data = transform(this.data.items, this.aspects);
this.busy = false;
const layout = {
title: {text: "Gun details sequence %{meta.sequence}"},
height: 200*this.aspects.length,
//autocolorscale: true,
/*
grid: {
rows: this.aspects.length,
columns: 1,
pattern: "coupled",
roworder: "bottom to top"
},
*/
//autosize: true,
// colorscale: "sequential",
xaxis: {
title: "Shotpoint",
showspikes: true
},
meta: this.data.meta
};
this.aspects.forEach ( (aspect, idx) => {
const num = idx+1;
const key = "yaxis" + num;
const anchor = "y" + num;
const segment = (1/this.aspects.length);
const margin = segment/20;
const domain = [
segment*idx + margin,
segment*num - margin
];
layout[key] = {
title: aspect,
anchor,
domain
}
});
const config = {
//editable: true,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphHeat, data, layout, config);
},
replot () {
if (!this.graph) {
return;
}
console.log("Replotting");
Object.values(this.$refs).forEach( ref => {
if (ref.data) {
console.log("Replotting", ref, ref.clientWidth, ref.clientHeight);
Plotly.relayout(ref, {
width: ref.clientWidth,
height: ref.clientHeight
});
}
});
},
...mapActions(["api"])
},
mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graphHeat);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graphHeat);
}
}
};
</script>


@@ -0,0 +1,381 @@
<template>
<v-card style="min-height:400px;">
<v-card-title class="headline">
Gun pressures
<v-spacer></v-spacer>
<v-switch v-model="shotpoint" label="Shotpoint"></v-switch>
<v-switch class="ml-4" v-model="violinplot" label="Violin plot"></v-switch>
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
<div class="graph-container" ref="graphSeries"></div>
</v-col>
</v-row>
<v-row v-show="shotpoint">
<v-col>
<div class="graph-container" ref="graphBar"></div>
</v-col>
</v-row>
<v-row v-show="violinplot">
<v-col>
<div class="graph-container" ref="graphViolin"></div>
</v-col>
</v-row>
</v-container>
<v-overlay :value="busy" absolute z-index="1">
<v-progress-circular indeterminate></v-progress-circular>
</v-overlay>
</v-card>
</template>
<style scoped>
.graph-container {
width: 100%;
height: 100%;
}
</style>
<script>
import * as d3a from 'd3-array';
import Plotly from 'plotly.js-dist';
import { mapActions, mapGetters } from 'vuex';
import unpack from '@/lib/unpack.js';
import * as aes from '@/lib/graphs/aesthetics.js';
export default {
name: 'DougalGraphGunsPressure',
props: [ "data", "settings" ],
data () {
return {
graph: null,
graphHover: null,
busy: false,
resizeObserver: null,
shotpoint: true,
violinplot: false
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
console.log("data changed");
if (newVal === null) {
this.busy = true;
} else {
this.busy = false;
this.plot();
}
},
settings () {
for (const key in this.settings) {
this[key] = this.settings[key];
}
},
shotpoint () {
if (this.shotpoint) {
this.replot();
}
this.$emit("update:settings", {[`${this.$options.name}.shotpoint`]: this.shotpoint});
},
violinplot () {
if (this.violinplot) {
this.plotViolin();
}
this.$emit("update:settings", {[`${this.$options.name}.violinplot`]: this.violinplot});
}
},
methods: {
plot () {
this.plotSeries();
if (this.violinplot) {
this.plotViolin();
}
},
async plotSeries () {
function transformSeries (d, src_number, otherParams={}) {
const meta = src_number
? unpack(d, "meta").filter( s => s.src_number == src_number )
: unpack(d, "meta");
const guns = unpack(meta, "guns").map(s => s.filter(g => g[2] == src_number));
const gunPressures = guns.map(s => s.map(g => g[11]));
const gunPressuresSorted = gunPressures.map(s => d3a.sort(s));
const gunVolumes = guns.map(s => s.map(g => g[12]));
const gunPressureWeights = gunVolumes.map( (s, sidx) => s.map( v => v/meta[sidx].volume ));
const gunsWeightedAvgPressure = gunPressures.map( (s, sidx) =>
d3a.sum(s.map( (pressure, gidx) => pressure * gunPressureWeights[sidx][gidx] )) / d3a.sum(gunPressureWeights[sidx])
);
const manifold = unpack(meta, "manifold");
const x = src_number
? unpack(d.filter(s => s.meta.src_number == src_number), "point")
: unpack(d, "point");
const traceManifold = {
name: "Manifold",
type: "scatter",
mode: "lines",
line: { ...aes.gunArrays[src_number || 1].avg.line, dash: "dot", width: 1 },
x,
y: manifold,
};
const tracesGunPressures = [{
type: "scatter",
mode: "lines",
x,
y: gunPressuresSorted.map(s => d3a.quantileSorted(s, 0.25)),
...aes.gunArrays[src_number || 1].min
},
{
type: "scatter",
mode: "lines",
fill: "tonexty",
x,
y: gunsWeightedAvgPressure,
...aes.gunArrays[src_number || 1].avg
},
{
type: "scatter",
mode: "lines",
fill: "tonexty",
x,
y: gunPressuresSorted.map(s => d3a.quantileSorted(s, 0.75)),
...aes.gunArrays[src_number || 1].max
}];
const tracesGunsPressuresIndividual = {
//name: `Array ${src_number} outliers`,
type: "scatter",
mode: "markers",
marker: {size: 2 },
hoverinfo: "skip",
x: gunPressuresSorted.map( (s, idx) =>
s.filter( g => g < d3a.quantileSorted(s, 0.05) || g > d3a.quantileSorted(s, 0.95))
.map( () => x[idx] )
).flat(),
y: gunPressuresSorted.map( (s, idx) =>
s.filter( g => g < d3a.quantileSorted(s, 0.05) || g > d3a.quantileSorted(s, 0.95))
).flat(),
...aes.gunArrays[src_number || 1].out
};
const data = [ traceManifold, ...tracesGunPressures, tracesGunsPressuresIndividual ]
return data;
}
if (!this.data) {
console.log("missing data");
return;
}
const sources = [ ...new Set(unpack(this.data.items, "meta").map( s => s.src_number ))];
const data = sources.map( src_number => transformSeries(this.data.items, src_number) ).flat();
console.log("Sources", sources);
console.log(data);
this.busy = false;
const layout = {
//autosize: true,
title: {text: "Gun pressures sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
hovermode: "x",
yaxis: {
title: "Pressure (psi)",
//zeroline: false
},
xaxis: {
title: "Shotpoint",
showspikes: true
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphSeries, data, layout, config);
this.$refs.graphSeries.on('plotly_hover', (d) => {
const point = d.points[0].x;
const item = this.data.items.find(s => s.point == point);
const guns = item.meta.guns.filter( g => g[2] == item.meta.src_number );
const gunIds = guns.map( g => "G"+g[1] );
const pressures = unpack(guns, 11);
const volumes = unpack(guns, 12);
const maxVolume = d3a.max(volumes);
const data = [{
type: "bar",
x: gunIds,
y: pressures,
width: volumes.map( v => v/maxVolume ),
transforms: [{
type: "groupby",
groups: unpack(guns, 0)
}],
}];
const layout = {
title: {text: "Gun pressures shot %{meta.point}"},
height: 300,
yaxis: {
title: "Pressure (psi)",
range: [ Math.min(d3a.min(pressures), 1950), Math.max(d3a.max(pressures), 2050) ]
},
xaxis: {
title: "Gun number",
type: 'category'
},
meta: {
point
}
};
const config = { displaylogo: false };
Plotly.react(this.$refs.graphBar, data, layout, config);
});
},
async plotViolin () {
function transformViolin (d, opts = {}) {
const styles = [];
unpack(unpack(d, "meta"), "guns").flat().forEach(i => {
const gunId = i[1];
const arrayId = i[2];
if (!styles[gunId]) {
styles[gunId] = Object.assign({target: gunId}, aes.gunArrayViolins[arrayId]);
}
});
const data = {
type: 'violin',
x: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1), // Gun number
y: unpack(unpack(unpack(d, "meta"), "guns").flat(), 11), // Gun pressure
points: 'none',
box: {
visible: true
},
line: {
color: 'green',
},
meanline: {
visible: true
},
transforms: [{
type: 'groupby',
groups: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1),
styles: styles.filter(i => !!i)
}]
}
return data;
}
console.log("plot violin");
if (!this.data) {
console.log("missing data");
return;
}
console.log("Will plot sequence", this.data.meta.project, this.data.meta.sequence);
const data = [ transformViolin(this.data.items) ];
this.busy = false;
const layout = {
//autosize: true,
showlegend: false,
title: {text: "Individual gun pressures sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
yaxis: {
title: "Pressure (psi)",
zeroline: false
},
xaxis: {
title: "Gun number"
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphViolin, data, layout, config);
},
replot () {
if (!this.graph) {
return;
}
console.log("Replotting");
Object.values(this.$refs).forEach( ref => {
if (ref.data) {
console.log("Replotting", ref, ref.clientWidth, ref.clientHeight);
Plotly.relayout(ref, {
width: ref.clientWidth,
height: ref.clientHeight
});
}
});
},
...mapActions(["api"])
},
mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graphSeries);
this.resizeObserver.observe(this.$refs.graphViolin);
this.resizeObserver.observe(this.$refs.graphBar);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graphBar);
this.resizeObserver.unobserve(this.$refs.graphViolin);
this.resizeObserver.unobserve(this.$refs.graphSeries);
}
}
};
</script>


@@ -0,0 +1,364 @@
<template>
<v-card style="min-height:400px;">
<v-card-title class="headline">
Gun timing
<v-spacer></v-spacer>
<v-switch v-model="shotpoint" label="Shotpoint"></v-switch>
<v-switch class="ml-4" v-model="violinplot" label="Violin plot"></v-switch>
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
<div class="graph-container" ref="graphSeries"></div>
</v-col>
</v-row>
<v-row v-show="shotpoint">
<v-col>
<div class="graph-container" ref="graphBar"></div>
</v-col>
</v-row>
<v-row v-show="violinplot">
<v-col>
<div class="graph-container" ref="graphViolin"></div>
</v-col>
</v-row>
</v-container>
<v-overlay :value="busy" absolute z-index="1">
<v-progress-circular indeterminate></v-progress-circular>
</v-overlay>
</v-card>
</template>
<style scoped>
.graph-container {
width: 100%;
height: 100%;
}
</style>
<script>
import * as d3a from 'd3-array';
import Plotly from 'plotly.js-dist';
import { mapActions, mapGetters } from 'vuex';
import unpack from '@/lib/unpack.js';
import * as aes from '@/lib/graphs/aesthetics.js';
export default {
name: 'DougalGraphGunsTiming',
props: [ "data", "settings" ],
data () {
return {
graph: null,
graphHover: null,
busy: false,
resizeObserver: null,
shotpoint: true,
violinplot: false
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
console.log("data changed");
if (newVal === null) {
this.busy = true;
} else {
this.busy = false;
this.plot();
}
},
settings () {
for (const key in this.settings) {
this[key] = this.settings[key];
}
},
shotpoint () {
if (this.shotpoint) {
this.replot();
}
this.$emit("update:settings", {[`${this.$options.name}.shotpoint`]: this.shotpoint});
},
violinplot () {
if (this.violinplot) {
this.plotViolin();
}
this.$emit("update:settings", {[`${this.$options.name}.violinplot`]: this.violinplot});
}
},
methods: {
plot () {
this.plotSeries();
if (this.violinplot) {
this.plotViolin();
}
},
async plotSeries () {
function transformSeries (d, src_number, otherParams={}) {
const meta = src_number
? unpack(d, "meta").filter( s => s.src_number == src_number )
: unpack(d, "meta");
const guns = unpack(meta, "guns").map(s => s.filter(g => g[2] == src_number));
const gunTimings = guns.map(s => s.map(g => g[9]));
const gunTimingsSorted = gunTimings.map(s => d3a.sort(s));
const gunsAvgTiming = gunTimings.map( (s, sidx) => d3a.mean(s) );
const x = src_number
? unpack(d.filter(s => s.meta.src_number == src_number), "point")
: unpack(d, "point");
const tracesGunTimings = [{
type: "scatter",
mode: "lines",
x,
y: gunTimingsSorted.map(s => d3a.quantileSorted(s, 0.25)),
...aes.gunArrays[src_number || 1].min
},
{
type: "scatter",
mode: "lines",
fill: "tonexty",
x,
y: gunsAvgTiming,
...aes.gunArrays[src_number || 1].avg
},
{
type: "scatter",
mode: "lines",
fill: "tonexty",
x,
y: gunTimingsSorted.map(s => d3a.quantileSorted(s, 0.75)),
...aes.gunArrays[src_number || 1].max
}];
const tracesGunsTimingsIndividual = {
//name: `Array ${src_number} outliers`,
type: "scatter",
mode: "markers",
marker: {size: 2 },
hoverinfo: "skip",
x: gunTimingsSorted.map( (s, idx) =>
s.filter( g => g < d3a.quantileSorted(s, 0.05) || g > d3a.quantileSorted(s, 0.95))
.map( () => x[idx] )
).flat(),
y: gunTimingsSorted.map( (s, idx) =>
s.filter( g => g < d3a.quantileSorted(s, 0.05) || g > d3a.quantileSorted(s, 0.95))
).flat(),
...aes.gunArrays[src_number || 1].out
};
const data = [ ...tracesGunTimings, tracesGunsTimingsIndividual ]
return data;
}
if (!this.data) {
console.log("missing data");
return;
}
const sources = [ ...new Set(unpack(this.data.items, "meta").map( s => s.src_number ))];
const data = sources.map( src_number => transformSeries(this.data.items, src_number) ).flat();
console.log("Sources", sources);
console.log(data);
this.busy = false;
const layout = {
//autosize: true,
title: {text: "Gun timings sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
hovermode: "x",
yaxis: {
title: "Timing (ms)",
//zeroline: false
},
xaxis: {
title: "Shotpoint",
showspikes: true
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphSeries, data, layout, config);
this.$refs.graphSeries.on('plotly_hover', (d) => {
const point = d.points[0].x;
const item = this.data.items.find(s => s.point == point);
const guns = item.meta.guns.filter( g => g[2] == item.meta.src_number );
const gunIds = guns.map( g => "G"+g[1] );
const timings = unpack(guns, 9);
const data = [{
type: "bar",
x: gunIds,
y: timings,
transforms: [{
type: "groupby",
groups: unpack(guns, 0)
}],
}];
const layout = {
title: {text: "Gun timings shot %{meta.point}"},
height: 300,
yaxis: {
title: "Timing (ms)",
range: [ Math.min(d3a.min(timings), 10), Math.max(d3a.max(timings), 20) ]
},
xaxis: {
title: "Gun number",
type: 'category'
},
meta: {
point
}
};
const config = { displaylogo: false };
Plotly.react(this.$refs.graphBar, data, layout, config);
});
},
async plotViolin () {
function transformViolin (d, opts = {}) {
const styles = [];
unpack(unpack(d, "meta"), "guns").flat().forEach(i => {
const gunId = i[1];
const arrayId = i[2];
if (!styles[gunId]) {
styles[gunId] = Object.assign({target: gunId}, aes.gunArrayViolins[arrayId]);
}
});
const data = {
type: 'violin',
x: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1), // Gun number
y: unpack(unpack(unpack(d, "meta"), "guns").flat(), 9), // Gun timing
points: 'none',
box: {
visible: true
},
line: {
color: 'green',
},
meanline: {
visible: true
},
transforms: [{
type: 'groupby',
groups: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1),
styles: styles.filter(i => !!i)
}]
}
return data;
}
console.log("plot violin");
if (!this.data) {
console.log("missing data");
return;
}
console.log("Will plot sequence", this.data.meta.project, this.data.meta.sequence);
const data = [ transformViolin(this.data.items) ];
this.busy = false;
const layout = {
//autosize: true,
showlegend: false,
title: {text: "Individual gun timings sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
yaxis: {
title: "Timing (ms)",
zeroline: false
},
xaxis: {
title: "Gun number"
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphViolin, data, layout, config);
},
replot () {
if (!this.graph) {
return;
}
console.log("Replotting");
Object.values(this.$refs).forEach( ref => {
if (ref.data) {
console.log("Replotting", ref, ref.clientWidth, ref.clientHeight);
Plotly.relayout(ref, {
width: ref.clientWidth,
height: ref.clientHeight
});
}
});
},
...mapActions(["api"])
},
mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graphSeries);
this.resizeObserver.observe(this.$refs.graphViolin);
this.resizeObserver.observe(this.$refs.graphBar);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graphBar);
this.resizeObserver.unobserve(this.$refs.graphViolin);
this.resizeObserver.unobserve(this.$refs.graphSeries);
}
}
};
</script>


@@ -0,0 +1,145 @@
<template>
<v-dialog v-model="open">
<template v-slot:activator="{ on, attrs }">
<v-btn icon v-bind="attrs" v-on="on" title="Configure visible aspects">
<v-icon small>mdi-wrench-outline</v-icon>
</v-btn>
</template>
<v-card>
<v-list nav subheader>
<v-subheader>Visualisations</v-subheader>
<v-list-item-group v-model="aspectsVisible" multiple>
<v-list-item value="DougalGraphGunsPressure">
<template v-slot:default="{ active }">
<v-list-item-action>
<v-checkbox :input-value="active"></v-checkbox>
</v-list-item-action>
<v-list-item-content>
<v-list-item-title>Series: Gun pressure</v-list-item-title>
<v-list-item-subtitle>Array pressures weighted averages</v-list-item-subtitle>
</v-list-item-content>
</template>
</v-list-item>
<v-list-item value="DougalGraphGunsTiming">
<template v-slot:default="{ active }">
<v-list-item-action>
<v-checkbox :input-value="active"></v-checkbox>
</v-list-item-action>
<v-list-item-content>
<v-list-item-title>Series: Gun timing</v-list-item-title>
<v-list-item-subtitle>Array timing averages</v-list-item-subtitle>
</v-list-item-content>
</template>
</v-list-item>
<v-list-item value="DougalGraphGunsDepth">
<template v-slot:default="{ active }">
<v-list-item-action>
<v-checkbox :input-value="active"></v-checkbox>
</v-list-item-action>
<v-list-item-content>
<v-list-item-title>Series: Gun depth</v-list-item-title>
<v-list-item-subtitle>Array depths averages</v-list-item-subtitle>
</v-list-item-content>
</template>
</v-list-item>
<v-list-item value="DougalGraphGunsHeatmap">
<template v-slot:default="{ active }">
<v-list-item-action>
<v-checkbox :input-value="active"></v-checkbox>
</v-list-item-action>
<v-list-item-content>
<v-list-item-title>Heatmap: Gun parameters</v-list-item-title>
<v-list-item-subtitle>Detail of every gun × every shotpoint</v-list-item-subtitle>
</v-list-item-content>
</template>
</v-list-item>
<v-list-item value="DougalGraphArraysIJScatter">
<template v-slot:default="{ active }">
<v-list-item-action>
<v-checkbox :input-value="active"></v-checkbox>
</v-list-item-action>
<v-list-item-content>
<v-list-item-title>Series: I/J error</v-list-item-title>
<v-list-item-subtitle>Inline / crossline error</v-list-item-subtitle>
</v-list-item-content>
</template>
</v-list-item>
</v-list-item-group>
</v-list>
<v-divider></v-divider>
<v-card-actions>
<v-btn v-if="user" color="warning" text @click="save" :title="'Save as preference for user '+user.name+' on this computer (other users may have other defaults).'">Save as default</v-btn>
<v-spacer></v-spacer>
<v-btn color="primary" text @click="open=false">Close</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
</template>
<script>
import { mapActions, mapGetters } from 'vuex';
export default {
name: "DougalGraphSettingsSequence",
props: [
"aspects"
],
data () {
return {
open: false,
aspectsVisible: this.aspects || []
}
},
watch: {
aspects () {
// Update the aspects selection list iff the list
// is not currently open.
if (!this.open) {
this.aspectsVisible = this.aspects;
}
}
},
computed: {
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
},
methods: {
save () {
this.open = false;
this.$nextTick( () => this.$emit("update:aspects", {aspects: [...this.aspectsVisible]}) );
},
reset () {
this.aspectsVisible = this.aspects || [];
}
}
}
</script>


@@ -33,7 +33,8 @@
text
:href="`mailto:${email}?Subject=Question`"
>
Ask a question
<v-icon class="d-lg-none">mdi-help-circle</v-icon>
<span class="d-none d-lg-inline">Ask a question</span>
</v-btn>
<v-btn
@@ -41,7 +42,17 @@
text
href="mailto:dougal-support@aaltronav.eu?Subject=Bug report"
>
Report a bug
<v-icon class="d-lg-none">mdi-bug</v-icon>
<span class="d-none d-lg-inline">Report a bug</span>
</v-btn>
<v-btn
color="info"
text
:href='"/feed/"+feed'
title="View development log"
>
<v-icon>mdi-rss</v-icon>
</v-btn>
<v-spacer></v-spacer>
@@ -52,7 +63,8 @@
text
@click="dialog=false"
>
Close
<v-icon class="d-lg-none">mdi-close-circle</v-icon>
<span class="d-none d-lg-inline">Close</span>
</v-btn>
</v-card-actions>
@@ -69,7 +81,8 @@ export default {
data () {
return {
dialog: false,
email: "dougal-support@aaltronav.eu"
email: "dougal-support@aaltronav.eu",
feed: btoa(encodeURIComponent("https://gitlab.com/wgp/dougal/software.atom?feed_token=XSPpvsYEny8YmH75Nz5W"))
};
}


@@ -2,8 +2,8 @@
<div class="line-status" v-if="sequences.length == 0">
<slot name="empty"></slot>
</div>
<div class="line-status" v-else-if="sequenceHref">
<router-link v-for="sequence in sequences"
<div class="line-status" v-else-if="sequenceHref || plannedSequenceHref || pendingReshootHref">
<router-link v-for="sequence in sequences" :key="sequence.sequence" v-if="sequenceHref"
class="sequence"
:class="sequence.status"
:style="style(sequence)"
@@ -11,15 +11,41 @@
:to="sequenceHref(sequence)"
>
</router-link>
<router-link v-for="sequence in plannedSequences" :key="sequence.sequence" v-if="plannedSequenceHref"
class="sequence planned"
:style="style(sequence)"
:title="title(sequence, 'planned')"
:to="plannedSequenceHref(sequence)"
>
</router-link>
<router-link v-for="(line, key) in pendingReshoots" :key="key" v-if="pendingReshootHref"
class="sequence reshoot"
:style="style(line)"
:title="title(line, 'reshoot')"
:to="pendingReshootHref(line)"
>
</router-link>
</div>
<div class="line-status" v-else>
<div v-for="sequence in sequences"
<div v-for="sequence in sequences" :key="sequence.sequence"
class="sequence"
:class="sequence.status"
:style="style(sequence)"
:title="title(sequence)"
>
</div>
<div v-for="sequence in plannedSequences" :key="sequence.sequence"
class="sequence planned"
:style="style(sequence)"
:title="title(sequence, 'planned')"
>
</div>
<div v-for="(line, key) in pendingReshoots" :key="key"
class="sequence reshoot"
:style="style(line)"
:title="title(line, 'reshoot')"
>
</div>
</div>
</template>
@@ -32,12 +58,12 @@
min-height 16px
background-color #d3d3d314
border-radius 4px
.sequence
flex 1 1 auto
opacity 0.5
border-radius 4px
&.ntbp
background-color red
&.raw
@@ -46,19 +72,27 @@
background-color green
&.online
background-color blue
&.planned
background-color magenta
&.reshoot
background repeating-linear-gradient(-45deg, rgba(255,0,255,0.302), brown 5px, rgba(247, 247, 247, 0.1) 5px, rgba(242, 241, 241, 0.08) 10px), repeating-linear-gradient(45deg, rgba(255,0,255,0.302), brown 5px, rgba(247, 247, 247, 0.1) 5px, rgba(242, 241, 241, 0.08) 10px)
</style>
<script>
export default {
name: 'DougalLineStatus',
props: {
preplot: Object,
sequences: Array,
"sequence-href": Function
"sequence-href": Function,
"planned-sequences": Array,
"planned-sequence-href": Function,
"pending-reshoots": Array,
"pending-reshoot-href": Function
},
methods: {
style (s) {
const values = {};
@@ -66,44 +100,50 @@ export default {
? s.fsp_final
: s.status == "ntbp"
? (s.fsp_final || s.fsp)
: s.fsp; /* status == "raw" */
: s.fsp; /* status == "raw" or planned sequence or pending reshoot */
const lsp = s.status == "final"
? s.lsp_final
: s.status == "ntbp"
? (s.lsp_final || s.lsp)
: s.lsp; /* status == "raw" */
: s.lsp; /* status == "raw" or planned sequence or pending reshoot */
const pp0 = Math.min(this.preplot.fsp, this.preplot.lsp);
const pp1 = Math.max(this.preplot.fsp, this.preplot.lsp);
const len = pp1-pp0;
const sp0 = Math.max(Math.min(fsp, lsp), pp0);
const sp1 = Math.min(Math.max(fsp, lsp), pp1);
const left = (sp0-pp0)/len;
const right = 1-((sp1-pp0)/len);
values["margin-left"] = left*100 + "%";
values["margin-right"] = right*100 + "%";
return values;
},
title (s) {
const status = s.status == "final"
? "Final"
: s.status == "raw"
? "Acquired"
: s.status == "ntbp"
? "NTBP"
: s.status;
const remarks = "\n"+[s.remarks, s.remarks_final].join("\n").trim()
return `Sequence ${s.sequence} ${status} (${s.fsp_final || s.fsp}–${s.lsp_final || s.lsp})${remarks}`;
title (s, type) {
if (s.status || type == "planned") {
const status = s.status == "final"
? "Final"
: s.status == "raw"
? "Acquired"
: s.status == "ntbp"
? "NTBP"
: type == "planned"
? "Planned"
: s.status;
const remarks = "\n"+[s.remarks, s.remarks_final].join("\n").trim()
return `Sequence ${s.sequence} ${status} (${s.fsp_final || s.fsp}–${s.lsp_final || s.lsp})${remarks}`;
} else if (type == "reshoot") {
return `Pending reshoot (${s.fsp}–${s.lsp})${s.remarks? "\n"+s.remarks : ""}`;
}
}
}
}
</script>


@@ -12,17 +12,70 @@
<v-toolbar-title class="mx-2" @click="$router.push('/')" style="cursor: pointer;">Dougal</v-toolbar-title>
<v-spacer></v-spacer>
<v-menu bottom offset-y>
<template v-slot:activator="{on, attrs}">
<v-hover v-slot="{hover}">
<v-btn
class="align-self-center"
:xcolor="hover ? 'secondary' : 'secondary lighten-3'"
small
text
v-bind="attrs"
v-on="on"
title="Settings"
>
<v-icon small>mdi-cog-outline</v-icon>
</v-btn>
</v-hover>
</template>
<v-list dense>
<v-list-item :href="`/settings/equipment`">
<v-list-item-title>Equipment list</v-list-item-title>
<v-list-item-action><v-icon small>mdi-view-list</v-icon></v-list-item-action>
</v-list-item>
</v-list>
</v-menu>
<v-breadcrumbs :items="path"></v-breadcrumbs>
<template v-if="$route.name != 'Login'">
<v-btn text link to="/login" v-if="!$root.user">Log in</v-btn>
<template v-else>
<v-btn title="Edit profile" disabled>
{{$root.user.name}}
</v-btn>
<v-btn class="ml-2" title="Log out" link to="/?logout=1">
<v-icon>mdi-logout</v-icon>
<v-btn text link to="/login" v-if="!user && !loading">Log in</v-btn>
<template v-else-if="user">
<v-menu
offset-y
>
<template v-slot:activator="{on, attrs}">
<v-avatar :color="user.colour || 'primary'" :title="`${user.name} (${user.role})`" v-bind="attrs" v-on="on">
<span class="white--text">{{user.name.slice(0, 5)}}</span>
</v-avatar>
</template>
<v-list dense>
<v-list-item link to="/login" v-if="user.autologin">
<v-list-item-icon><v-icon small>mdi-login</v-icon></v-list-item-icon>
<v-list-item-content>
<v-list-item-title>Log in as a different user</v-list-item-title>
<v-list-item-subtitle>Autologin from {{user.ip}}</v-list-item-subtitle>
</v-list-item-content>
</v-list-item>
<v-list-item link to="/logout" v-else>
<v-list-item-icon><v-icon small>mdi-logout</v-icon></v-list-item-icon>
<v-list-item-title>Log out</v-list-item-title>
</v-list-item>
</v-list>
</v-menu>
<!--
<v-btn small text class="ml-2" title="Log out" link to="/?logout=1">
<v-icon small>mdi-logout</v-icon>
</v-btn>
-->
</template>
</template>
<template v-slot:extension v-if="$route.matched.find(i => i.name == 'Project')">
@@ -35,6 +88,7 @@
</template>
<script>
import { mapActions, mapGetters } from 'vuex';
export default {
name: 'DougalNavigation',
@@ -49,6 +103,7 @@ export default {
{ href: "calendar", text: "Calendar" },
{ href: "log", text: "Log" },
{ href: "qc", text: "QC" },
{ href: "graphs", text: "Graphs" },
{ href: "map", text: "Map" }
],
path: []
@@ -58,7 +113,9 @@ export default {
computed: {
tab () {
return this.tabs.findIndex(t => t.href == this.$route.path.split(/\/+/)[3]);
}
},
...mapGetters(['user', 'loading'])
},
watch: {


@@ -0,0 +1,135 @@
<template>
<v-hover v-slot:default="{hover}" v-if="!isEmpty(item)">
<span>
<v-btn v-if="!isAccepted(item)"
:class="{'text--disabled': !hover}"
icon
small
color="primary"
:title="isMultiple(item) ? 'Accept all' : 'Accept'"
@click.stop="accept(item)">
<v-icon small :color="isAccepted(item) ? 'green' : ''">
{{ isMultiple(item) ? 'mdi-check-all' : 'mdi-check' }}
</v-icon>
</v-btn>
<v-btn v-if="someAccepted(item)"
:class="{'text--disabled': !hover}"
icon
small
color="primary"
:title="isMultiple(item) ? 'Restore all' : 'Restore'"
@click.stop="unaccept(item)">
<v-icon small>
mdi-restore
</v-icon>
</v-btn>
</span>
</v-hover>
</template>
<script>
export default {
name: 'DougalQcAcceptance',
props: {
item: { type: Object }
},
methods: {
isAccepted (item) {
if (item._children) {
return item._children.every(child => this.isAccepted(child));
}
if (item.labels) {
return item.labels.includes("QCAccepted");
}
return false;
},
someAccepted (item) {
if (item._children) {
return item._children.some(child => this.someAccepted(child));
}
if (item.labels) {
return item.labels.includes("QCAccepted");
}
return false;
},
isEmpty (item) {
return item._children?.length === 0;
},
isMultiple (item) {
return item._children?.length;
},
action (action, item) {
const items = [];
const iterate = (item) => {
if (item._kind == "point") {
if (this.isAccepted(item)) {
if (action == "unaccept") {
items.push(item);
}
} else {
if (action == "accept") {
items.push(item);
}
}
} else if (item._kind == "sequence" || item._kind == "test") {
if (item._children) {
for (const child of item._children) {
iterate(child);
}
}
if (item._shots) {
for (const child of item._shots) {
iterate(child);
}
}
}
}
iterate(item);
return items;
},
accept (item) {
const items = this.action('accept', item);
if (items.length) {
this.$emit('accept', items);
}
},
unaccept (item) {
const items = this.action('unaccept', item);
if (items.length) {
this.$emit('unaccept', items);
}
}
}
}
</script>


@@ -1,5 +1,5 @@
export default function FormatTimestamp (str) {
const d = new Date(str);
if (isNaN(d)) {


@@ -0,0 +1,88 @@
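// Plotly trace style fragments, keyed by gun array (source) number.
// Each entry is spread into the min / avg / max quantile band traces
// and the outlier markers of the per-sequence gun graphs.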
export const gunArrays = {
1: {
min: {
fillcolor: "rgba(200, 230, 201, 0.2)",
line: {color: "rgba(129, 199, 132, 0.3)", shape: "spline"},
showlegend: false,
name: "Array 1 (min.)",
hoverinfo: "skip"
},
avg: {
fillcolor: "rgba(200, 230, 201, 0.2)",
line: {color: "rgba(129, 199, 132, 0.9)", shape: "spline"},
name: "Array 1 (avg.)"
},
max: {
fillcolor: "rgba(200, 230, 201, 0.2)",
line: {color: "rgba(129, 199, 132, 0.4)", shape: "spline"},
showlegend: false,
name: "Array 1 (max.)",
hoverinfo: "skip"
},
out: {
name: "Array 1 outliers",
line: {color: "rgba(129, 199, 166, 0.7)"},
fillcolor: "rgba(129, 199, 166, 0.5)"
}
},
2: {
min: {
fillcolor: "rgba(255, 205, 210, 0.2)",
line: {color: "rgba(229, 115, 115, 0.3)", shape: "spline"},
showlegend: false,
name: "Array 2 (min.)",
hoverinfo: "skip"
},
avg: {
fillcolor: "rgba(255, 205, 210, 0.2)",
line: {color: "rgba(229, 115, 115, 0.9)", shape: "spline"},
name: "Array 2 (avg.)"
},
max: {
fillcolor: "rgba(255, 205, 210, 0.2)",
line: {color: "rgba(229, 115, 115, 0.4)", shape: "spline"},
showlegend: false,
name: "Array 2 (max.)",
hoverinfo: "skip"
},
out: {
name: "Array 2 outliers",
line: {color: "rgba(229, 153, 115, 0.7)"},
fillcolor: "rgba(229, 153, 115, 0.5)"
}
},
3: {
min: {
fillcolor: "",
line: {color: "", shape: "spline"},
showlegend: false,
name: "Array 3 (min.)",
hoverinfo: "skip"
},
avg: {
fillcolor: "",
line: {color: "", shape: "spline"},
name: "Array 3 (avg.)"
},
max: {
fillcolor: "",
line: {color: "", shape: "spline"},
showlegend: false,
name: "Array 3 (max.)",
hoverinfo: "skip"
},
out: {
name: "Array 3 outliers",
//fillcolor: ""
}
}
};
export const gunArrayViolins = {
1: {
value: {line: {color: "rgba(129, 199, 132, 0.9)"}}
},
2: {
value: {line: {color: "rgba(229, 115, 115, 0.9)"}}
}
};

View File

@@ -0,0 +1,11 @@
const marked = require('marked');
function markdown (str) {
return marked(String(str));
}
function markdownInline (str) {
return marked.parseInline(String(str));
}
module.exports = { markdown, markdownInline };
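// Example: markdownInline("**3** misfires") → "<strong>3</strong> misfires",
// while markdown() wraps block-level output in <p>…</p>.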

View File

@@ -0,0 +1,33 @@
/**
* Throttle a function call.
*
* It delays `callback` by `delay` ms and ignores any
* repeated calls from `caller` within at most `maxWait`
* milliseconds.
*
* Used to react to server events in cases where we get
* a separate notification for each row of a bulk update.
*/
function throttle (callback, caller, delay = 100, maxWait = 500) {
const schedule = async () => {
caller.triggeredAt = Date.now();
caller.timer = setTimeout(async () => {
await callback();
caller.timer = null;
}, delay);
}
if (!caller.timer) {
schedule();
} else {
const elapsed = Date.now() - caller.triggeredAt;
if (elapsed > maxWait) {
clearTimeout(caller.timer);
schedule();
}
}
}
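// Usage sketch (hypothetical handler names): collapse a burst of per-row
// server notifications into a single reload. The `caller` object persists
// the timer state between invocations:
//
//   const refreshState = {};
//   function onRowNotification () {
//     throttle(() => reloadTable(), refreshState);
//   }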
export default throttle;


@@ -0,0 +1,4 @@
export default function unpack(rows, key) {
return rows && rows.map( row => row[key] );
};
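// Example: unpack([{point: 101}, {point: 102}], "point") → [101, 102].
// Calls nest to pull columns out of nested rows, e.g.
// unpack(unpack(items, "meta"), "guns") → one guns array per shot.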


@@ -1,4 +1,4 @@
function withParentProps(item, parent, childrenKey, prop, currentValue) {
if (!Array.isArray(parent)) {
@@ -26,4 +26,116 @@ function withParentProps(item, parent, childrenKey, prop, currentValue) {
return [];
}
export { withParentProps }
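// Format a latitude / longitude pair as degrees-minutes-seconds,
// e.g. dms(60.25, 4.5) → `60°15'00.0" N 004°30'00.0" E`.
// (Note: the λ-prefixed locals below hold the latitude parts and the
// φ-prefixed ones the longitude parts.)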
function dms (lat, lon) {
const λh = lat < 0 ? "S" : "N";
const φh = lon < 0 ? "W" : "E";
const λn = Math.abs(lat);
const φn = Math.abs(lon);
const λi = Math.trunc(λn);
const φi = Math.trunc(φn);
const λf = λn - λi;
const φf = φn - φi;
const λs = ((λf*3600)%60).toFixed(1);
const φs = ((φf*3600)%60).toFixed(1);
const λm = Math.trunc(λf*60);
const φm = Math.trunc(φf*60);
const λ =
String(λi).padStart(2, "0") + "°" +
String(λm).padStart(2, "0") + "'" +
String(λs).padStart(4, "0") + '" ' +
λh;
const φ =
String(φi).padStart(3, "0") + "°" +
String(φm).padStart(2, "0") + "'" +
String(φs).padStart(4, "0") + '" ' +
φh;
return λ+" "+φ;
}
function geometryAsString (item, opts = {}) {
const key = "key" in opts ? opts.key : "geometry";
const formatDMS = opts.dms;
let str = "";
if (key in item) {
const geometry = item[key];
if (geometry && "coordinates" in geometry) {
if (geometry.type == "Point") {
if (formatDMS) {
str = dms(geometry.coordinates[1], geometry.coordinates[0]);
} else {
str = `${geometry.coordinates[1].toFixed(6)}, ${geometry.coordinates[0].toFixed(6)}`;
}
}
if (str) {
if (opts.url) {
if (typeof opts.url === 'string') {
str = `[${str}](${opts.url.replace("$x", geometry.coordinates[0]).replace("$y", geometry.coordinates[1])})`;
} else {
str = `[${str}](geo:${geometry.coordinates[1]},${geometry.coordinates[0]})`;
}
}
}
}
}
return str;
}
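// Example (hypothetical item): a GeoJSON Point renders as "lat, lon":
//
//   geometryAsString({ geometry: { type: "Point", coordinates: [4.5, 60.25] } });
//   // → "60.250000, 4.500000"; with { dms: true } → `60°15'00.0" N 004°30'00.0" E`
//
// With opts.url set, the string is wrapped in a Markdown link; a string
// URL may use the $x / $y placeholders for the coordinates.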
/** Extract preferences by prefix.
*
* This function returns a lambda which, given
* a key or a prefix, extracts the relevant
* preferences from the designated preferences
* store.
*
* For instance, assume preferences = {
* "a.b.c.d": 1,
* "a.b.e.f": 2,
* "g.h": 3
* }
*
* And λ = preferencesλ(preferences). Then:
*
* λ("a.b") → { "a.b.c.d": 1, "a.b.e.f": 2 }
* λ("a.b.e.f") → { "a.b.e.f": 2 }
* λ("g.x", {"g.x.": 99}) → { "g.x.": 99 }
* λ("a.c", {"g.x.": 99}) → { "g.x.": 99 }
*
* Note from the last two examples that a default value
* may be provided and is returned when no stored key
* matches the given prefix. Keys under a matching prefix
* are returned with the prefix stripped (first example).
*/
function preferencesλ (preferences) {
return function (key, defaults={}) {
const keys = Object.keys(preferences).filter(str => str.startsWith(key+".") || str == key);
const settings = {...defaults};
for (const str of keys) {
const k = str == key ? str : str.substring(key.length+1);
const v = preferences[str];
settings[k] = v;
}
return settings;
}
}
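// Usage sketch (hypothetical preference values), matching how the graph
// components key their settings by component name:
//
//   const λ = preferencesλ({ "DougalGraphGunsDepth.shotpoint": false, "ui.theme": "dark" });
//   λ("DougalGraphGunsDepth"); // → { shotpoint: false }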
export {
withParentProps,
geometryAsString,
preferencesλ
}


@@ -5,12 +5,22 @@ import store from './store'
import vuetify from './plugins/vuetify'
import vueDebounce from 'vue-debounce'
import { mapMutations } from 'vuex';
import { markdown, markdownInline } from './lib/markdown';
import { geometryAsString } from './lib/utils';
Vue.config.productionTip = false
Vue.use(vueDebounce);
Vue.filter('markdown', markdown);
Vue.filter('markdownInline', markdownInline);
Vue.filter('position', (str, item, opts) =>
str
.replace(/@POS(ITION)?@/g, geometryAsString(item, opts) || "(position unknown)")
.replace(/@DMS@/g, geometryAsString(item, {...opts, dms:true}) || "(position unknown)")
);
// Vue.filter('position', (str, item, opts) => str.replace(/@POS(ITION)?@/, "☺"));
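// Example: "Buoy deployed @POS@" with a Point geometry renders as
// "Buoy deployed 60.250000, 4.500000" (use @DMS@ for sexagesimal output).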
new Vue({
data () {
return {


@@ -1,6 +1,8 @@
import Vue from 'vue'
import VueRouter from 'vue-router'
import Home from '../views/Home.vue'
import Login from '../views/Login.vue'
import Logout from '../views/Logout.vue'
import Project from '../views/Project.vue'
import ProjectList from '../views/ProjectList.vue'
import ProjectSummary from '../views/ProjectSummary.vue'
@@ -12,6 +14,7 @@ import SequenceSummary from '../views/SequenceSummary.vue'
import Calendar from '../views/Calendar.vue'
import Log from '../views/Log.vue'
import QC from '../views/QC.vue'
import Graphs from '../views/Graphs.vue'
import Map from '../views/Map.vue'
@@ -31,6 +34,40 @@ Vue.use(VueRouter)
    // which is lazy-loaded when the route is visited.
    component: () => import(/* webpackChunkName: "about" */ '../views/About.vue')
  },
  {
    path: '/feed/:source',
    name: 'Feed',
    // route level code-splitting
    // this generates a separate chunk (about.[hash].js) for this route
    // which is lazy-loaded when the route is visited.
    component: () => import(/* webpackChunkName: "about" */ '../views/Feed.vue')
  },
  {
    path: "/settings/equipment",
    name: "equipment",
    component: () => import(/* webpackChunkName: "about" */ '../views/Equipment.vue')
  },
  {
    pathToRegexpOptions: { strict: true },
    path: "/login",
    redirect: "/login/"
  },
  {
    pathToRegexpOptions: { strict: true },
    name: "Login",
    path: "/login/",
    component: Login,
    meta: {
      // breadcrumbs: [
      //   { text: "Projects", href: "/projects", disabled: true }
      // ]
    }
  },
  {
    // pathToRegexpOptions: { strict: true },
    path: "/logout",
    component: Logout,
  },
  {
    pathToRegexpOptions: { strict: true },
    path: "/projects",
@@ -114,8 +151,19 @@ Vue.use(VueRouter)
    path: "qc",
    component: QC
  },
  {
    path: "graphs",
    component: Graphs,
    children: [
      { path: "sequence/:sequence", name: "graphsBySequence" },
      { path: "sequence/:sequence0/:sequence1", name: "graphsBySequences" },
      { path: "date/:date0", name: "graphsByDate" },
      { path: "date/:date0/:date1", name: "graphsByDates" }
    ]
  },
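  // Example URLs the children above resolve (project prefix elided; dates
  // assumed to be the ISO YYYY-MM-DD form the event filters elsewhere in
  // this diff compare against):
  //   …/graphs/sequence/12                  → graphsBySequence
  //   …/graphs/sequence/12/15               → graphsBySequences
  //   …/graphs/date/2023-10-01              → graphsByDate
  //   …/graphs/date/2023-10-01/2023-10-07   → graphsByDates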
  {
    path: "map",
    name: "map",
    component: Map
  }
]
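
The paired /login entries above rely on vue-router's strict matching: with pathToRegexpOptions: { strict: true }, "/login" and "/login/" are distinct paths, so the bare path can redirect to the canonical trailing-slash route. A minimal sketch of the pattern in isolation (route names reused from the diff, everything else invented):

// Minimal sketch — not part of the diff. Strict matching keeps "/login"
// and "/login/" distinct so the first entry can normalise the URL.
const routes = [
  { pathToRegexpOptions: { strict: true }, path: "/login", redirect: "/login/" },
  { pathToRegexpOptions: { strict: true }, path: "/login/", name: "Login", component: Login }
];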

View File

@@ -2,8 +2,14 @@ import Vue from 'vue'
import Vuex from 'vuex'
import api from './modules/api'
import user from './modules/user'
import snack from './modules/snack'
import project from './modules/project'
import event from './modules/event'
import label from './modules/label'
import sequence from './modules/sequence'
import plan from './modules/plan'
import line from './modules/line'
import notify from './modules/notify'
Vue.use(Vuex)
@@ -11,8 +17,14 @@ Vue.use(Vuex)
export default new Vuex.Store({
  modules: {
    api,
    user,
    snack,
    project,
    event,
    label,
    sequence,
    plan,
    line,
    notify
  }
})
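
Since the cross-module dispatch('api') calls elsewhere in this diff are unprefixed, these modules appear to be registered without namespacing, so a component can map their actions directly. A hypothetical component (invented for illustration):

// Hypothetical component — assumes the modules above are not namespaced.
import { mapActions } from 'vuex';

export default {
  methods: {
    // refreshEvents and getEvents come from the event module registered above.
    ...mapActions(['refreshEvents', 'getEvents'])
  },
  created () {
    this.refreshEvents();
  }
};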

View File

@@ -13,13 +13,17 @@ async function api ({state, commit, dispatch}, [resource, init = {}, cb]) {
       init.body = JSON.stringify(init.body);
     }
   }
-  const res = await fetch(`${state.apiUrl}${resource}`, init);
+  const url = /^https?:\/\//i.test(resource) ? resource : (state.apiUrl + resource);
+  const res = await fetch(url, init);
   if (typeof cb === 'function') {
-    cb(null, res);
+    await cb(null, res);
   }
   if (res.ok) {
     await dispatch('setCredentials');
     try {
-      return await res.json();
+      return init.text ? (await res.text()) : (await res.json());
     } catch (err) {
       if (err instanceof SyntaxError) {
         if (Number(res.headers.get("Content-Length")) === 0) {
@@ -31,7 +35,14 @@ async function api ({state, commit, dispatch}, [resource, init = {}, cb]) {
       throw err;
     }
   } else {
-    await dispatch('showSnack', [res.statusText, "warning"]);
+    let message = res.statusText;
+    // Guard against a missing Content-Type header before matching on it.
+    if ((res.headers.get("Content-Type") || "").match(/^application\/json/i)) {
+      const body = await res.json();
+      if (body.message) {
+        message = body.message;
+      }
+    }
+    await dispatch('showSnack', [message, "warning"]);
   }
 } catch (err) {
   if (err && err.name == "AbortError") return;
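
Taken together, these changes let callers pass absolute URLs, have their callback awaited, and opt into a plain-text response. Hypothetical calls (resource paths invented for illustration):

// Hypothetical dispatches against the reworked api action — paths invented.
const events = await dispatch('api', ['/project/1/event']);                  // parsed JSON, as before
const report = await dispatch('api', ['/project/1/report', { text: true }]); // raw text via res.text()
const feed   = await dispatch('api', ['https://example.org/feed.json']);     // absolute URL skips state.apiUrl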

View File

@@ -0,0 +1,129 @@
/** Fetch events from server
*/
async function refreshEvents ({commit, dispatch, state, rootState}, [modifiedAfter] = []) {
  if (!modifiedAfter) {
    modifiedAfter = state.timestamp;
  }
  // Abort any request still in flight; setEventsLoading is expected to
  // stash a fresh AbortController in state.loading.
  if (state.loading) {
    commit('abortEventsLoading');
  }
  commit('setEventsLoading');
  const pid = rootState.project.projectId;
  // With a timestamp, request only events modified since then,
  // e.g. /project/42/event/changes/2023-10-17T10:00:00.000Z?unique=t;
  // otherwise fetch the full event list.
  const url = modifiedAfter
    ? `/project/${pid}/event/changes/${(new Date(modifiedAfter)).toISOString()}?unique=t`
    : `/project/${pid}/event`;
  const init = {
    signal: state.loading.signal
  };
  const res = await dispatch('api', [url, init]);
  if (res) {
    if (modifiedAfter) {
      commit('setModifiedEvents', res);
    } else {
      commit('setEvents', res);
    }
    commit('setEventsTimestamp');
  }
  commit('clearEventsLoading');
}
/** Return a subset of events from state.events
*/
async function getEvents ({commit, dispatch, state}, [projectId, {sequence, date0, date1, sortBy, sortDesc, itemsPerPage, page, text, label}]) {
  let filteredEvents = [...state.events];
  if (sortBy) {
    // Single comparator: earlier entries in sortBy take precedence, and
    // sortDesc[idx] inverts the order for that key only. (Sorting once per
    // key and reversing the whole array would let the last key dominate
    // and scramble previous passes.) Missing values sort before present ones.
    filteredEvents.sort( (el0, el1) => {
      for (let idx = 0; idx < sortBy.length; idx++) {
        const key = sortBy[idx];
        const a = el0?.[key];
        const b = el1?.[key];
        let cmp = 0;
        if (a < b) {
          cmp = -1;
        } else if (a > b) {
          cmp = 1;
        } else if (a != b) {
          // Incomparable pair (one side null/undefined): present values sort last.
          cmp = (a == null) ? -1 : 1;
        }
        if (sortDesc && sortDesc[idx] === true) {
          cmp = -cmp;
        }
        if (cmp !== 0) {
          return cmp;
        }
      }
      return 0;
    });
  }
  if (sequence) {
    filteredEvents = filteredEvents.filter( event => event.sequence == sequence );
  }
  if (date0 && date1) {
    filteredEvents = filteredEvents.filter( event =>
      event.tstamp.substr(0, 10) >= date0 && event.tstamp.substr(0, 10) <= date1
    );
  } else if (date0) {
    filteredEvents = filteredEvents.filter( event => event.tstamp.substr(0, 10) == date0 );
  }
  if (text) {
    const textFilter = (value, search, item) => {
      return String(value).toLowerCase().includes(search.toLowerCase());
    };
    const numberFilter = (value, search, item) => {
      return value == search; // loose equality: the search term arrives as a string
    };
    const searchFunctions = {
      tstamp: textFilter,
      sequence: numberFilter,
      point: numberFilter,
      remarks: textFilter,
      // Guard against events without labels before testing each label.
      labels: (value, search, item) => (value || []).some(label => textFilter(label, search, item))
    };
    // Keep an event if any of its searchable columns matches the query.
    filteredEvents = filteredEvents.filter( event => {
      for (let key in searchFunctions) {
        const fn = searchFunctions[key];
        if (fn(event[key], text, event)) {
          return true;
        }
      }
      return false;
    });
  }
  if (label) {
    filteredEvents = filteredEvents.filter( event => event.labels?.includes(label) );
  }
  const count = filteredEvents.length;
  if (itemsPerPage && itemsPerPage > 0) {
    const offset = (page > 0)
      ? (page-1) * itemsPerPage
      : 0;
    filteredEvents = filteredEvents.slice(offset, offset+itemsPerPage);
  }
  return {events: filteredEvents, count};
}
export default { refreshEvents, getEvents };
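
A consuming component might then combine the two actions roughly as follows (component code invented for illustration; the options object mirrors what getEvents destructures above):

// Hypothetical component method — not part of this diff.
async fetchPage () {
  // Pull any server-side changes since the last refresh…
  await this.$store.dispatch('refreshEvents');
  // …then sort, filter and paginate locally from state.events.
  const { events, count } = await this.$store.dispatch('getEvents', [
    this.projectId,
    { sortBy: ["tstamp"], sortDesc: [true], itemsPerPage: 25, page: 1, text: this.search }
  ]);
  this.events = events;
  this.total = count;
}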

Some files were not shown because too many files have changed in this diff.