Compare commits

..

194 Commits

Author SHA1 Message Date
D. Berge
4c2a2617a1 Adapt Project component to Vuex use for fetching data.
The Project component is now responsible for fetching and
updating the data used by most project tabs, with the
exception of ProjectSummary, QC, Graphs and Map. It is
also the only one listening for server events and reacting
to them.

Individual tabs are still responsible for sending data to
the server, at least for the time being.
2023-10-25 16:19:18 +02:00
D. Berge
5021888d03 Adapt Log component to Vuex use for fetching data 2023-10-25 16:18:41 +02:00
D. Berge
bf633f7fdf Refactor Calendar component.
- adapts it to Vuex use for fetching data
- displays extra events in 4-day and day views
- allows classifying by event label in 4-day and day views
2023-10-25 16:16:01 +02:00
D. Berge
847f49ad7c Adapt SequenceList component to Vuex use for fetching data 2023-10-25 16:15:17 +02:00
D. Berge
171feb9dd2 Adapt Plan component to Vuex use for fetching data 2023-10-25 16:14:45 +02:00
D. Berge
503a0de12f Adapt LineList component to Vuex use for fetching data 2023-10-25 16:13:56 +02:00
D. Berge
cf89a43f64 Add project configuration to Vuex store 2023-10-25 16:11:24 +02:00
D. Berge
680e376ed1 Add Vuex sequence module 2023-10-25 16:11:24 +02:00
D. Berge
a26974670a Add Vuex plan module 2023-10-25 16:11:24 +02:00
D. Berge
16a6cb59dc Add Vuex line module 2023-10-25 16:11:24 +02:00
D. Berge
829e206831 Add Vuex label module 2023-10-25 09:59:04 +02:00
D. Berge
83244fcd1a Add Vuex event module 2023-10-25 09:51:28 +02:00
D. Berge
851369a0b4 Invalidate planner endpoint cache when setting remarks 2023-10-23 14:58:41 +02:00
D. Berge
5065d62443 Update planner endpoint documentation 2023-10-23 14:57:27 +02:00
D. Berge
2d1e1e9532 Modify return payload of planner endpoint.
Previous:

[
  { sequence: …},
  { sequence: …},
  …
]

Current:

{
  remarks: "…",
  sequences: [
    { sequence: …},
    { sequence: …},
    …
  ]
}
2023-10-23 14:53:32 +02:00
D. Berge
051049581a Merge branch '278-rewrite-events-queue' into 'devel'
Resolve "Rewrite events queue"

Closes #278

See merge request wgp/dougal/software!46
2023-10-17 10:28:21 +00:00
D. Berge
da5ae18b0b Merge branch '269-support-requesting-a-partial-update-from-the-events-log-endpoint' into devel 2023-10-17 12:27:31 +02:00
D. Berge
ac9353c101 Add database upgrade file 31. 2023-10-17 12:27:06 +02:00
D. Berge
c4c5c44bf1 Add comment 2023-10-17 12:20:19 +02:00
D. Berge
d3659ebf02 Merge branch '269-support-requesting-a-partial-update-from-the-events-log-endpoint' into 'devel'
Resolve "Support requesting a partial update from the events log endpoint"

Closes #269

See merge request wgp/dougal/software!47
2023-10-17 10:18:41 +00:00
D. Berge
6b5070e634 Add event changes API endpoint description 2023-10-17 12:15:41 +02:00
D. Berge
09ff96ceee Add events change API endpoint 2023-10-17 11:15:36 +02:00
D. Berge
f231acf109 Add events change middleware 2023-10-17 11:15:06 +02:00
D. Berge
e576e1662c Add library function returning event changes after given epoch 2023-10-17 11:13:58 +02:00
D. Berge
6a21ddd1cd Rewrite events listener and handlers.
The events listener now uses a proper self-consuming queue and
the event handlers have been rewritten accordingly.

The way this works is that running init() on the handlers
library instantiates the handlers and returns two higher-order
functions, prepare() and despatch(). A call to the latter of
these is appended to the queue with each new incoming event.

The handlers have access to a context object (ctx) which may be
used to persist data between calls and/or exchange data between
handlers. This is used notably to give the handlers access to
project configurations, which are themselves refreshed by a
project configuration change handler (DetectProjectConfigurationChange).
2023-10-14 20:53:42 +02:00
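
A minimal sketch of the shape described above (init(), prepare(), despatch(), ctx and DetectProjectConfigurationChange are the names given in the message; the handler interface itself is assumed for illustration):

function init () {
  const ctx = {};                                 // persists data between calls, shared between handlers
  const handlers = [
    new DetectProjectConfigurationChange(ctx),    // keeps project configurations in ctx refreshed
    // …other handlers…
  ];

  // Let handlers set themselves up (e.g. load project configurations into ctx)
  async function prepare () {
    for (const h of handlers) await h.prepare?.(ctx);
  }

  // Run every handler against one incoming event; a call to this
  // is what gets appended to the queue for each new event
  async function despatch (event) {
    for (const h of handlers) await h.handle(event, ctx);
  }

  return { prepare, despatch };
}
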
D. Berge
c1e35b2459 Cache project configuration details.
This avoids requesting the project configurations on every single
incoming message. A listener refreshes the data on configuration
changes.
2023-10-14 20:11:18 +02:00
D. Berge
eee2a96029 Modify logging statements 2023-10-14 20:10:46 +02:00
D. Berge
6f5e5a4d20 Fix bug for shortcut when there is only one candidate project 2023-10-14 20:09:07 +02:00
D. Berge
9e73cb7e00 Clean up on SIGINT, SIGHUP signals 2023-10-14 20:07:19 +02:00
D. Berge
d7ab4eec7c Run some tasks periodically from the main process.
This reduces reliance on crontab jobs.
2023-10-14 20:06:38 +02:00
D. Berge
cdd96a4bc7 Don't bother trying to kill the child process on exit.
As the exit signal handler does not allow asynchronous tasks and
besides, killing the parent should kill its children too.
2023-10-14 20:02:54 +02:00
D. Berge
39a21766b6 Exit on start up errors 2023-10-14 20:02:04 +02:00
D. Berge
0e33c18b5c Replace console.log() with debug library calls 2023-10-14 19:57:57 +02:00
D. Berge
7f411ac7dd Add queue libraries.
A basic queue implementation and one that consumes its items
automatically until empty.
2023-10-14 19:56:56 +02:00
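
A rough sketch of the self-consuming variant (an assumed implementation, not the actual library code):

class AutoQueue {
  constructor () {
    this.items = [];
    this.running = false;
  }

  push (task) {          // task is an async function
    this.items.push(task);
    this.drain();        // start consuming unless already doing so
  }

  async drain () {
    if (this.running) return;
    this.running = true;
    while (this.items.length) {
      const task = this.items.shift();
      try { await task(); } catch (err) { console.error(err); }
    }
    this.running = false;
  }
}

Because drain() awaits each task before taking the next, items are processed strictly one at a time, which is what commit 6eccbf215a (below) relies on.
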
D. Berge
ed1da11c9d Add helper function to purge notifications 2023-10-14 19:54:34 +02:00
D. Berge
66ec28dd83 Refactor DB notifications listener to support large payloads.
The listener will automatically retrieve the full payload
before passing it on to event handlers.
2023-10-14 18:33:41 +02:00
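
PostgreSQL caps NOTIFY payloads at just under 8 kB, hence the need to retrieve large payloads separately. A sketch of the pattern, with `subscriber` being a pg-listen subscriber as in the commits further down (channel, table and column names here are invented; only the general shape is implied by the commit):

subscriber.notifications.on("dougal_events", async (payload) => {
  if (payload.ref) {
    // Payload too large to inline: the notification only carries a reference
    const res = await db.query("SELECT payload FROM notify_payloads WHERE id = $1", [payload.ref]);
    payload = res.rows[0].payload;
  }
  despatch(payload);   // hand the full payload to the event handlers
});
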
D. Berge
b928d96774 Add database upgrade file 30. 2023-10-14 18:29:28 +02:00
D. Berge
73335f9c1e Merge branch '136-add-line-change-time-log-pseudoevent' into 'devel'
Resolve "Add line change time log pseudoevent"

Closes #136

See merge request wgp/dougal/software!45
2023-10-04 12:50:49 +00:00
D. Berge
7b6b81dbc5 Add more debugging statements 2023-10-04 14:50:12 +02:00
D. Berge
2e11c574c2 Throw rather than return.
Otherwise the finally {} block won't run.
2023-10-04 14:49:35 +02:00
D. Berge
d07565807c Do not retry immediately 2023-10-04 14:49:09 +02:00
D. Berge
6eccbf215a There should be no need to await.
That is because the queue handler will, in theory, only ever
process one event at a time.
2023-09-30 21:29:15 +02:00
D. Berge
8abc05f04e Remove dead code 2023-09-30 21:29:15 +02:00
D. Berge
8f587467f9 Add comment 2023-09-30 21:29:15 +02:00
D. Berge
3d7a91c7ff Rewrite ReportLineChangeTime 2023-09-30 21:29:15 +02:00
D. Berge
3fd408074c Support passing array in opts.sequences to event.list() 2023-09-30 21:29:15 +02:00
D. Berge
f71cbd8f51 Add unique utility function 2023-09-30 21:29:15 +02:00
D. Berge
915df8ac16 Add handler for creation of line change time events 2023-09-30 21:29:15 +02:00
D. Berge
d5ecb08a2d Allow switching to event entry by time.
A ‘Timed’ button is shown when a new (not edited) event is in
the event entry dialogue and the event has sequence and/or
point values. Pressing the button deletes the sequence/point
information and sets the date and time fields to current time.

Fixes #277.
2023-09-30 21:26:32 +02:00
D. Berge
9388cd4861 Make daily_tasks work with new project configuration 2023-09-30 20:36:46 +02:00
D. Berge
180590b411 Mark events as being automatically generated 2023-09-30 01:42:27 +02:00
D. Berge
4ec37539bf Add utils to work with Postgres ranges 2023-09-30 01:41:45 +02:00
D. Berge
8755fe01b6 Refactor events.list.
The SQL has been simplified and the following changes made:

- The `sequence` argument now can only take one individual
  sequence, not a list of sequences.
- A new `sequences` argument is recognised. It takes a list
  of sequences (as a string).
- A new `label` argument is recognised. It takes a label
  name and returns events containing that label.
- A new `jpq` argument is recognised. It takes a JSONPath
  string which is applied to `meta` with jsonb_path_exists(),
  returning any events for which the JSON path expression
  matches.
2023-09-30 01:37:22 +02:00
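
Example calls under the new arguments (the call shape is assumed from the description; the jsonpath string is standard Postgres syntax, applied via jsonb_path_exists(), and the `generated` key is illustrative):

// Events on a list of sequences that carry a given label
await event.list({ sequences: "101,102,103", label: "FSP" });

// Events whose meta marks them as automatically generated
await event.list({ jpq: "$.generated ? (@ == true)" });
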
D. Berge
0bfe54e0c2 Include the meta attribute when posting events 2023-09-30 01:36:18 +02:00
D. Berge
29bc689b84 Merge branch '276-add-soft-start-event-detection' into 'devel'
Resolve "Add soft start event detection"

Closes #276

See merge request wgp/dougal/software!44
2023-09-29 15:02:57 +00:00
D. Berge
65682febc7 Add soft start and full volume events detection 2023-09-29 17:02:03 +02:00
D. Berge
d408665d62 Write meta info to automatic events 2023-09-29 16:49:27 +02:00
D. Berge
64fceb0a01 Merge branch '127-sol-eol-events-not-being-inserted-in-the-log-automatically' into 'devel'
Resolve "SOL / EOL events not being inserted in the log automatically"

Closes #127

See merge request wgp/dougal/software!43
2023-09-29 14:17:46 +00:00
D. Berge
ab58e578c9 Use DEBUG library throughout 2023-09-29 16:16:33 +02:00
D. Berge
0e58b8fa5b Refactor code to identify candidate schemas.
As part of the refactoring, we took into account a slight payload
format change (project configuration details are under the `data`
attribute).
2023-09-29 16:13:35 +02:00
D. Berge
99ac082f00 Use common naming convention both online and offline 2023-09-29 16:11:44 +02:00
D. Berge
4d3fddc051 Merge branch '274-use-new-db-event-notifier-for-event-processing-handlers' into 'devel'
Resolve "Use new DB event notifier for event processing handlers"

Closes #275, #230, and #274

See merge request wgp/dougal/software!42
2023-09-29 14:03:00 +00:00
D. Berge
42456439a9 Remove ad-hoc notifier 2023-09-29 15:59:12 +02:00
D. Berge
ee0c0e7308 Replace ad-hoc notifier with pg-listen based version 2023-09-29 15:59:12 +02:00
D. Berge
998c272bf8 Add var/* to .gitignore 2023-09-29 15:59:12 +02:00
D. Berge
daddd1f0e8 Add script to rewrite packet captures IP and MAC addresses.
Closes #230.
2023-09-29 15:58:59 +02:00
D. Berge
17f20535cb Cope with fragmented UDP packets.
Fixes #275.

Use this as the systemd unit file to run as a service:

[Unit]
Description=Dougal Network Packet Capture
After=network.target remote-fs.target nss-lookup.target

[Service]
ExecStart=/srv/dougal/software/sbin/packet-capture.sh
ExecStop=/bin/kill -s QUIT $MAINPID
Restart=always
User=root
Group=users
Environment=PATH=/usr/bin:/usr/sbin:/usr/local/bin
Environment=INS_HOST=172.31.10.254
WorkingDirectory=/srv/dougal/software/var/
SyslogIdentifier=dougal.pcap

[Install]
WantedBy=multi-user.target
2023-09-29 15:28:11 +02:00
D. Berge
0829ea3ea1 Save a copy of the headers, not the original.
Otherwise ExpressJS will complain about trying to modify
headers that have already been sent.
2023-09-24 12:17:16 +02:00
D. Berge
2069d9c3d7 Remove dead code 2023-09-24 12:15:06 +02:00
D. Berge
8a2d526c50 Ignore schema attribute in PATCH payload.
Fixes #273.
2023-09-24 12:14:20 +02:00
D. Berge
8ad96d6f73 Ensure that requiredFields is always defined.
Otherwise, `Object.entries(requiredFields)` may fail.
2023-09-24 11:59:26 +02:00
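
The guard is against plain JavaScript semantics: Object.entries(undefined) throws a TypeError. A defensive default along these lines (names illustrative) is all that is needed:

const entries = Object.entries(requiredFields ?? {});
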
D. Berge
947faf8c05 Provide default glob specification for map layer imports 2023-09-24 11:34:10 +02:00
D. Berge
a948556455 Fail gracefully if map layer data does not exist.
Fixes #272.
2023-09-24 11:33:32 +02:00
D. Berge
835384b730 Apply path conversion to QC definition files 2023-09-23 22:50:09 +02:00
D. Berge
c5b93794f4 Move path conversion to general utilities 2023-09-23 13:44:53 +02:00
D. Berge
056cd32f0e Merge branch '271-qc-results-not-being-refreshed' into 'devel'
Resolve "QC results not being refreshed"

Closes #271

See merge request wgp/dougal/software!41
2023-09-18 10:08:35 +00:00
D. Berge
49bb413110 Merge branch '270-real-time-interface-stopped-working' into 'devel'
Resolve "Real-time interface stopped working"

Closes #270

See merge request wgp/dougal/software!40
2023-09-18 10:08:27 +00:00
D. Berge
ceccc42050 Don't cache response ETags for QC endpoints 2023-09-18 12:06:38 +02:00
D. Berge
aa3379e1c6 Adapt RTI save function to refactored project configuration in DB 2023-09-18 11:58:55 +02:00
D. Berge
4063af0e25 Merge branch '268-inline-crossline-errors-no-longer-being-calculated' into 'devel'
Resolve "Inline/crossline errors no longer being calculated"

Closes #268

See merge request wgp/dougal/software!39
2023-09-15 18:03:51 +00:00
D. Berge
d53e6060a4 Update database templates to v0.4.2 2023-09-15 20:01:54 +02:00
D. Berge
85d8fc8cc0 Update required database version 2023-09-15 14:22:22 +02:00
D. Berge
0fe40b1839 Add missing require 2023-09-15 14:22:02 +02:00
D. Berge
21de4b757f Add database upgrade file 29. 2023-09-15 12:52:42 +02:00
D. Berge
96cdbb2cff Add database upgrade file 28. 2023-09-15 12:52:27 +02:00
D. Berge
d531643b58 Add database upgrade file 27. 2023-09-15 12:52:06 +02:00
D. Berge
a1779ef488 Do not cache /navdata endpoint responses 2023-09-14 13:20:16 +02:00
D. Berge
5239dece1e Do not cache GIS endpoint responses 2023-09-14 13:19:57 +02:00
D. Berge
a7d7837816 Allow only admins to patch project configurations 2023-09-14 13:19:16 +02:00
D. Berge
ebcfc7df47 Allow everyone to access project configuration.
This is necessary as it is requested by various parts of the
frontend.

Consider more granular access control.
2023-09-14 13:17:28 +02:00
D. Berge
dc4b9002fe Adapt QC endpoints to new configuration APIs 2023-09-14 13:15:59 +02:00
D. Berge
33618b6b82 Do not cache Set-Cookie headers 2023-09-14 13:13:47 +02:00
D. Berge
597d407acc Adapt QC view to new label payload from API 2023-09-14 13:13:18 +02:00
D. Berge
6162a5bdee Stop importing P1/90s until scripts are upgraded.
See #266.
2023-09-14 13:09:38 +02:00
D. Berge
696bbf7a17 Take etc/config.yaml out of revision control.
This file contains site-specific configuration. Instead, an
example config.yaml is now provided.
2023-09-14 13:07:33 +02:00
D. Berge
821fcf0922 Add wx forecast info to plan (experiment).
Use https://open-meteo.com/ as a weather forecast provider.

This code is intended for demonstration only, not for
production purposes.

(issue #157)


(cherry picked from commit cc4bce1356)
2023-09-13 20:04:15 +00:00
D. Berge
b1712d838f Merge branch '245-export-event-log-as-csv' into 'devel'
Resolve "Export event log as CSV"

Closes #245

See merge request wgp/dougal/software!38
2023-09-13 20:02:07 +00:00
D. Berge
895b865505 Expose CSV output option in user interface 2023-09-13 21:59:57 +02:00
D. Berge
5a2af5c49e Add CSV output option for events log 2023-09-13 21:58:06 +02:00
D. Berge
24658f4017 Allow patching project name if no name is already set 2023-09-13 16:13:43 +02:00
D. Berge
6707cda75e Ignore case when patching configuration ID 2023-09-13 16:13:12 +02:00
D. Berge
1302a31b3d Improve formatting of layer alert 2023-09-13 13:00:19 +02:00
D. Berge
871a1e8f3a Don't show alert if layer is empty (but log to console) 2023-09-13 12:59:47 +02:00
D. Berge
04e1144bab Simplify expression 2023-09-13 12:59:24 +02:00
D. Berge
6312d94f3e Add support for user layer tooltips and popups 2023-09-13 12:58:44 +02:00
D. Berge
ed91026319 Add tooltip and popup options to map layer configuration.
- `tooltip` takes the name of a GeoJSON property that will be
  shown in a tooltip when hovering the mouse over a feature.

- `popup` can take either the name of a property as above, or
  the boolean value `true`. In the latter case, a table of all
  the feature's properties will be shown when clicking on the
  feature. In the former case, only the value of the designated
  property will be shown.
2023-09-13 12:55:37 +02:00
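
Combined with the layer structure used by the importer (see bin/import_map_layers.py in the file listing below), a layer definition using these options might look like this; the layer name and path are made up:

imports:
  map:
    layers:
      Obstructions:
        - format: geojson
          path: /gis/obstructions
          globs:
            - "**/*.geojson"
          tooltip: name   # show each feature's `name` property on hover
          popup: true     # show a table of all feature properties on click
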
D. Berge
441a4e296d Import map layers from the runner 2023-09-13 11:24:04 +02:00
D. Berge
c33c3f61df Alert the user if a map layer is too big 2023-09-13 11:22:49 +02:00
D. Berge
2cc293b724 Do not fail trying to restore state for non-existing layers 2023-09-13 11:22:05 +02:00
D. Berge
ee129b2faa Merge branch '114-allow-users-to-show-arbitrary-geojson-on-the-map' into 'devel'
Resolve "Allow users to show arbitrary GeoJSON on the map."

Closes #114

See merge request wgp/dougal/software!37
2023-09-12 17:34:51 +00:00
D. Berge
98d9b3b093 Adapt Map view to new label payload from API 2023-09-12 19:31:58 +02:00
D. Berge
57b9b420f8 Show an error if a layer is too large.
The map view limits the size of layers (both user and regular) in
order to keep the system responsive, as Leaflet is not great at
handling large layers.
2023-09-12 19:29:02 +02:00
D. Berge
9e73f2603a Implement user layers on map view.
The user layers are defined in the project configuration under
`imports.map.layers`.

Multiple layers may be defined and each layer may consist of one
or more GeoJSON files. Files are retrieved via the /files/ API
endpoint.
2023-09-12 19:29:02 +02:00
D. Berge
707889be42 Refactor layer API endpoint and database functions.
- A single get() function is used both to list all available
  layers, if no layer name is given, or a single layer.
- The database no longer holds the actual layer contents,
  only the path to the layer file(s), so the list() function
  is now redundant as we return the full payload in every case.
- The /gis/layer and /gis/layer/:name endpoints now have the same
  payload structure.
2023-09-12 19:29:02 +02:00
D. Berge
f9a70e0145 Refactor map layer importer.
- Now a layer may consist of a path pointing to a directory plus a
  glob, or a path pointing directly to a single file.
- If a file already exists in the database, check if the layer
  name has changed and if so, update it.
- Do not import the actual file contents, as the path is enough
  (it can be retrieved via the /file/:path API endpoint).
2023-09-12 11:05:10 +02:00
D. Berge
b71489cee1 Add get_file_data() function to datastore 2023-09-12 11:04:37 +02:00
D. Berge
0a9bde5f10 Add Background layer to map.
This is a limited implementation of layer backgrounds. The API
supports an arbitrary number of arbitrarily named background
layers, but for the time being we only recognise one background
layer named `Background` and of GeoJSON type.

Certain properties, such a colour/color, opacity, etc., are
recognised and applied as feature styles. If not, a default
style is used.
2023-09-11 10:17:10 +02:00
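
A feature carrying such style hints might look like this; only colour/color and opacity are named above, the rest is a plain GeoJSON skeleton:

{
  "type": "Feature",
  "geometry": {
    "type": "Polygon",
    "coordinates": [[[2.00, 51.00], [2.10, 51.00], [2.10, 51.10], [2.00, 51.00]]]
  },
  "properties": {
    "color": "#cc3300",
    "opacity": 0.6
  }
}
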
D. Berge
36d5862375 Add map layer middleware and API endpoints 2023-09-11 10:15:19 +02:00
D. Berge
398c702004 Add map layer functions to database interface 2023-09-11 10:12:46 +02:00
D. Berge
b2d1798338 Add map layer importer 2023-09-11 10:00:59 +02:00
D. Berge
4f165b0c83 Revert behaviour of new jwt-express version.
Fixes breakage introduced in commit
cd00f8b995.
2023-09-10 14:09:01 +02:00
D. Berge
2c86944a51 Merge branch '262-preset-remarks-and-labels-no-longer-working-with-api-0-4-0' into 'devel'
Resolve "Preset remarks and labels no longer working with API 0.4.0"

Closes #262

See merge request wgp/dougal/software!36
2023-09-10 10:10:22 +00:00
D. Berge
5fc51de7d8 Adapt Log view to new configuration endpoint in the API 2023-09-10 12:01:59 +02:00
D. Berge
158e0fb788 Adapt Log view to new label payload from API 2023-09-10 12:01:30 +02:00
D. Berge
941d15c1bc Return labels directly from project configuration.
NOTE: This is a breaking API change. Before this it returned an
array of labels, now it returns an object.
2023-09-10 11:59:38 +02:00
D. Berge
cd00f8b995 Breaking-change Node package updates (server) 2023-09-10 11:49:56 +02:00
D. Berge
44515f8e78 Non-breaking Node package updates (server) 2023-09-09 20:54:04 +02:00
D. Berge
54fbc76da5 Merge branch '261-wrong-missing-shots-value-in-sequence-summary' into 'devel'
Resolve "Wrong missing shots value in sequence summary"

Closes #261

See merge request wgp/dougal/software!35
2023-09-09 18:46:33 +00:00
D. Berge
c1b5196134 Update database templates to v0.3.12.
Incorporates fix for bug #261.
2023-09-09 20:45:11 +02:00
D. Berge
fb3d3be546 Trailing slash in API call results in "unauthorised" error.
No idea why.
2023-09-09 20:39:49 +02:00
D. Berge
8e11e242ed Remove NODE_OPTIONS from scripts.
Node version 18 does not seem to like it.
2023-09-09 20:37:08 +02:00
D. Berge
8a815ce3ef Add database upgrade file 26. 2023-09-09 20:23:20 +02:00
D. Berge
91076a50ad Show API error messages if available 2023-09-09 17:00:32 +02:00
D. Berge
e624dcdde0 Support async API callbacks in Vuex action 2023-09-09 16:59:43 +02:00
D. Berge
a25676122c Update material design icons dependency 2023-09-09 16:58:44 +02:00
D. Berge
e4dfbe2c9a Update minimum node version to 18 2023-09-09 16:57:20 +02:00
D. Berge
78fb34d049 Update the API version number 2023-09-09 16:56:52 +02:00
D. Berge
38c4125f4f Support patching values out of the configuration.
A configuration patch having keys with null values will result
in those keys being removed from the configuration.
2023-09-09 16:53:42 +02:00
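
For example (keys illustrative, except defaultLineChangeDuration, which appears in the planner configuration elsewhere in this changeset), patching

{ "planner": { "defaultLineChangeDuration": null } }

into a configuration containing

{ "planner": { "defaultLineChangeDuration": 180, "autoAdjust": true } }

leaves

{ "planner": { "autoAdjust": true } }
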
D. Berge
04d6cbafe3 Use refactored database API in QC executable 2023-09-09 16:42:30 +02:00
D. Berge
e6319172d8 Fix typo in QC executable 2023-09-09 16:42:00 +02:00
D. Berge
5230ff63e3 Use new database API calls for configuration 2023-09-09 16:39:53 +02:00
D. Berge
2b364bbff7 Make bin script compatible with Python 3.6 2023-09-09 16:38:51 +02:00
D. Berge
c4b330b2bb Don't cache ETags for /files/ endpoint.
As we have no practical way of invalidating those.
2023-09-02 16:06:31 +02:00
D. Berge
308eda6342 Use ETag middleware 2023-09-02 15:29:39 +02:00
D. Berge
e8b1cb27f1 Add ETag middleware 2023-09-02 15:29:24 +02:00
D. Berge
ed14fd0ced Add notifier to DB library 2023-09-02 15:28:17 +02:00
D. Berge
fb10e56487 Add pg-listen dependency 2023-09-02 15:26:53 +02:00
D. Berge
56ed0cbc79 Merge branch '246-add-endpoint-for-creating-a-new-survey' into 'devel'
Resolve "Add endpoint for creating a new survey"

Closes #179, #174, and #246

See merge request wgp/dougal/software!29
2023-09-02 13:10:56 +00:00
D. Berge
227e588782 Merge branch '248-dougal-event-log-takes-a-long-time-to-register-new-events' into 'devel'
Resolve "Dougal event log takes a long time to register new events"

Closes #248

See merge request wgp/dougal/software!30
2023-09-02 13:09:56 +00:00
D. Berge
53f2108e37 Adapt import functions to use logical paths 2023-08-30 14:56:09 +02:00
D. Berge
ccf4bbf547 Use logical paths rather than physical 2023-08-30 14:54:27 +02:00
D. Berge
c99a625b60 Add function to retrieve survey configurations from DB.
As the survey definitions will no longer be stored in files
under etc/surveys/ but directly in the database, this
function replaces configuration.surveys().
2023-08-30 14:27:15 +02:00
D. Berge
25ab623328 Add functions for translating paths.
The Dougal database will no longer store physical file paths
but rather logical ones, relative to (config.yaml).imports.paths.

These functions translate between physical and logical paths.
2023-08-30 14:17:47 +02:00
D. Berge
455888bdac Fix method signature 2023-08-30 14:16:08 +02:00
D. Berge
b650ece0ee Add import.paths key to config.yaml.
Used to tell Dougal which parts of the filesystem may be
accessed by users via the API (more specifically, via the
`/files/` API endpoints).
2023-08-30 14:12:07 +02:00
D. Berge
2cb96c0252 Let user download P1s from the Sequences tab 2023-08-30 14:08:28 +02:00
D. Berge
70cf59bb4c Add API files endpoint.
Used to download files. It relies on `imports.paths` being set
appropriately in `etc/config.yaml` to indicate which parts of
the filesystem are accessible to users via Dougal.
2023-08-30 13:51:31 +02:00
D. Berge
ec03627119 Remove logging statements 2023-08-30 13:48:26 +02:00
D. Berge
675c19f060 Fix whitespace 2023-08-30 13:47:51 +02:00
D. Berge
6721b1b96b Add API endpoint for patching a project 2023-08-30 13:47:02 +02:00
D. Berge
b4f23822c4 Fix db.configuration.get() 2023-08-30 13:43:36 +02:00
D. Berge
3dd1aaeddb Fix indentation 2023-08-30 13:42:25 +02:00
D. Berge
1e593e6d75 Clean up if project creation fails 2023-08-30 13:41:28 +02:00
D. Berge
ddbcb90c1f Add deepMerge() utility function 2023-08-30 13:37:01 +02:00
D. Berge
229fdf20ef Reload the project list on insert or deletion 2023-08-23 19:35:12 +02:00
D. Berge
72e67d0e5d React to project deletion 2023-08-23 19:34:47 +02:00
D. Berge
b26fefbc37 Show user-friendly message if a project cannot be found 2023-08-23 19:33:50 +02:00
D. Berge
04e0482f60 Vuex: add getters for project info 2023-08-23 19:31:22 +02:00
D. Berge
62f90846a8 Vuex: clear project variables if project not found 2023-08-23 19:30:52 +02:00
D. Berge
1f9c0e56fe Default npm run serve to 0.0.0.0 2023-08-23 19:29:13 +02:00
D. Berge
fe9d3563a0 Add API endpoint to delete a project 2023-08-23 19:26:27 +02:00
D. Berge
38a07dffc6 Add API endpoint to retrieve project configuration.
Only available to users with at least `write` access.
2023-08-23 19:26:27 +02:00
D. Berge
1a6500308f Add API endpoint for creating a project 2023-08-23 19:26:27 +02:00
D. Berge
6033b45ed3 Refactor API middleware.
The middleware naming is kept consistent with the HTTP verb that
they handle.
2023-08-23 19:17:20 +02:00
D. Berge
33edef6647 Use modified body-parser accepting YAML 2023-08-23 19:12:44 +02:00
D. Berge
8f8e8b7492 Implement db.project.delete().
Removes a project from the database, but only if the project is
empty, i.e., it has no preplots, no lines and no events in its
log (except deleted).
2023-08-21 14:50:20 +02:00
D. Berge
ab5e3198aa Add DB function to return project configuration.
NOTE: mostly redundant with db.configuration.get(),
see previous commit.
2023-08-21 14:49:22 +02:00
D. Berge
60ed850d2d Change db.configuration.get() to use database.
NOTE: this endpoint is redundant with db.project.configuration.get()
except for its ability to return a partial tree.

TODO: merge this with db.project.configuration.get().
2023-08-21 14:46:51 +02:00
D. Berge
63b9cc5b16 Add database functions for project creation.
Instead of storing the project configuration in a YAML file
under `etc/surveys/`, this is now stored in public.projects.meta.

NOTE: as of this commit, the runner scripts (`bin/*.py`) are not
aware of this change and they will keep looking for project info
under `etc/surveys`. This means that projects created directly
in the database will be invisible to Dougal until the runner
scripts are changed accordingly.
2023-08-21 14:39:45 +02:00
D. Berge
f2edd2bec5 Refactor project DB functions.
The old db.project.list() function is now db.project.get()
and the old db.project.get() is now db.project.summary.get().

If a project does not exist, db.project.summary.get() now
throws a 404 rather than a database error.
2023-08-21 14:36:02 +02:00
D. Berge
44ad59130f Add pid2schema function.
Translates a project ID into a database schema name.
2023-08-21 14:31:23 +02:00
D. Berge
7cb2c3ef49 Add comment 2023-05-30 17:20:35 +02:00
D. Berge
ff4f6bfd78 Ensure that we're connected to the Dougal database 2023-05-30 17:19:23 +02:00
D. Berge
fbe0cb5efa Default the API prefix to /api 2023-05-18 18:34:10 +02:00
D. Berge
aa7cbed611 Do not require authentication to query API version 2023-05-18 18:32:26 +02:00
D. Berge
89061f6411 Print port and prefix on startup 2023-05-18 18:30:48 +02:00
D. Berge
838883d8a3 Update caniuse version (package-lock) 2023-05-18 18:29:44 +02:00
D. Berge
cd196f1acd Add option needed for node v16+ support.
Note: this may cause the client *not* to start on node versions
less than 16.
2023-05-18 18:28:36 +02:00
D. Berge
a2b894fceb Fix class instantiation error.
Closes #252.
2023-05-12 15:32:12 +02:00
D. Berge
c3b3a4c70f Remove lock file if inhibiting tasks 2023-04-11 20:50:59 +02:00
D. Berge
8118641231 Do not run tasks if required mounts are not present.
A configuration item `imports.mounts` is added to
`etc/config.yaml`. This should be a list of paths
which must be non-empty. If any of the paths in that
list is empty, runner.sh will abort.

Closes #200.
2023-04-10 15:04:12 +02:00
D. Berge
6d8a199a3c Allow setting IP to listen on.
Running on bare metal, 127.0.0.1 is a sensible choice of address
to bind on, but that is not the case when running inside a
container, so we add the ability to choose which IP to listen on.

This can be given via the environment variable HTTP_HOST when
starting the server or, if used as a module, as the second
argument of the start(port, host, path) function.
2023-04-07 09:04:51 +02:00
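
Both forms of the interface described above (entry point and module path assumed):

HTTP_HOST=0.0.0.0 node lib/www/server.js             # via the environment

require("./lib/www/server").start(8080, "0.0.0.0")   # via the start(port, host, path) signature
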
D. Berge
5a44e20a5b Do not error out of npm install if postinstall fails.
The postinstall script will (rightly) return non-zero if the API
docs cannot be built, but this creates a problem when building a
container (Docker) image. In that case, we expect the postinstall
to fail, as the required files (spec/*) will not have been copied
into the image when `npm install` is run.

By adding an explicit OR clause we allow postinstall to end
gracefully whether or not the API docs have been built.
2023-04-05 12:32:35 +02:00
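
The package.json pattern being described (script name hypothetical; the `||` clause is the point):

{
  "scripts": {
    "postinstall": "build-api-docs || echo 'API docs not built (expected when building a container image)'"
  }
}
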
D. Berge
374739c133 Request ancillary library via HTTPS rather than SSH.
Otherwise newer versions of npm will choke during `npm install` due
to this npm bug: https://github.com/npm/cli/issues/2610
2023-04-04 18:16:50 +02:00
158 changed files with 6232 additions and 1389 deletions

.gitignore (vendored): 2 lines changed

@@ -11,3 +11,5 @@ lib/www/client/dist/
etc/surveys/*.yaml
!etc/surveys/_*.yaml
etc/ssl/*
etc/config.yaml
var/*

bin/check_mounts_present.py (new executable file): 27 lines

@@ -0,0 +1,27 @@
#!/usr/bin/python3
"""
Check if any of the directories provided in the imports.mounts configuration
section are empty.
Returns 0 if all arguments are non-empty, 1 otherwise. It stops at the first
empty directory.
"""
import os

import configuration

cfg = configuration.read()
if cfg and "imports" in cfg and "mounts" in cfg["imports"]:
    mounts = cfg["imports"]["mounts"]
    for item in mounts:
        with os.scandir(item) as contents:
            if not any(contents):
                exit(1)
else:
    print("No mounts in configuration")
exit(0)

(file name not shown)

@@ -1,4 +1,5 @@
import os
import pathlib
from glob import glob
from yaml import full_load as _load
@@ -11,6 +12,18 @@ surveys should be under $HOME/etc/surveys/*.yaml. In both cases,
$HOME is the home directory of the user running this script.
"""
def is_relative_to(it, other):
    """
    is_relative_to() is not present before Python 3.9, so we
    need this kludge to get Dougal to run on OpenSUSE 15.4
    """
    if "is_relative_to" in dir(it):
        return it.is_relative_to(other)
    return str(it.absolute()).startswith(str(other.absolute()))
prefix = os.environ.get("DOUGAL_ROOT", os.environ.get("HOME", ".")+"/software")
DOUGAL_ROOT = os.environ.get("DOUGAL_ROOT", os.environ.get("HOME", ".")+"/software")
@@ -54,6 +67,10 @@ def files (globspec = None, include_archived = False):
    quickly and temporarily “disabling” a survey configuration by renaming
    the relevant file.
    """
    print("This method is obsolete")
    return

    tuples = []
    if globspec is None:
@@ -87,3 +104,73 @@ def rxflags (flagstr):
    for flag in flagstr:
        flags |= cases.get(flag, 0)
    return flags
def translate_path (file):
    """
    Translate a path from a Dougal import directory to an actual
    physical path on disk.

    Any user files accessible by Dougal must be under a path prefixed
    by `(config.yaml).imports.paths`. The value of `imports.paths` may
    be either a string, in which case this represents the prefix under
    which all Dougal data resides, or a dictionary where the keys are
    logical paths and their values the corresponding physical path.
    """
    cfg = read()
    root = pathlib.Path(DOUGAL_ROOT)
    filepath = pathlib.Path(file).resolve()
    import_paths = cfg["imports"]["paths"]
    if filepath.is_absolute():
        if type(import_paths) == str:
            # Substitute the root for the real physical path
            # NOTE: `root` deals with import_paths not being absolute
            prefix = root.joinpath(pathlib.Path(import_paths)).resolve()
            return str(pathlib.Path(prefix).joinpath(*filepath.parts[2:]))
        else:
            # Look for a match on the second path element
            if filepath.parts[1] in import_paths:
                # NOTE: `root` deals with import_paths[…] not being absolute
                prefix = root.joinpath(import_paths[filepath.parts[1]])
                return str(pathlib.Path(prefix).joinpath(*filepath.parts[2:]))
            else:
                # This path is invalid
                raise TypeError("invalid path or file: {0!r}".format(filepath))
    else:
        # A relative filepath is always resolved relative to the logical root
        root = pathlib.Path("/")
        return translate_path(root.joinpath(filepath))

def untranslate_path (file):
    """
    Attempt to convert a physical path into a logical one.

    See `translate_path()` above for details.
    """
    cfg = read()
    dougal_root = pathlib.Path(DOUGAL_ROOT)
    filepath = pathlib.Path(file).resolve()
    import_paths = cfg["imports"]["paths"]
    physical_root = pathlib.Path("/")
    if filepath.is_absolute():
        if type(import_paths) == str:
            if is_relative_to(filepath, import_paths):
                physical_root = pathlib.Path("/")
                physical_prefix = pathlib.Path(import_paths)
                return str(root.joinpath(filepath.relative_to(physical_prefix)))
            else:
                raise TypeError("invalid path or file: {0!r}".format(filepath))
        else:
            for key, value in import_paths.items():
                value = dougal_root.joinpath(value)
                physical_prefix = pathlib.Path(value)
                if is_relative_to(filepath, physical_prefix):
                    logical_prefix = physical_root.joinpath(pathlib.Path(key)).resolve()
                    return str(logical_prefix.joinpath(filepath.relative_to(physical_prefix)))
            # If we got here with no matches, this is not a valid
            # Dougal data path
            raise TypeError("invalid path or file: {0!r}".format(filepath))
    else:
        # A relative filepath is always resolved relative to DOUGAL_ROOT
        return untranslate_path(root.joinpath(filepath))

(file name not shown)

@@ -11,11 +11,9 @@ from datastore import Datastore
if __name__ == '__main__':
    print("Reading configuration")
    surveys = configuration.surveys()

    print("Connecting to database")
    db = Datastore()
    surveys = db.surveys()

    print("Reading surveys")
    for survey in surveys:

(file name not shown)

@@ -52,7 +52,7 @@ class Datastore:
        self.conn = psycopg2.connect(configuration.read()["db"]["connection_string"], **opts)

    def set_autocommit(value = True):
    def set_autocommit(self, value = True):
        """
        Enable or disable autocommit.
@@ -95,7 +95,7 @@ class Datastore:
        cursor.execute(qry, (filepath,))
        results = cursor.fetchall()
        if len(results):
            return (filepath, file_hash(filepath)) in results
            return (filepath, file_hash(configuration.translate_path(filepath))) in results

    def add_file(self, path, cursor = None):
@@ -107,7 +107,8 @@ class Datastore:
        else:
            cur = cursor

        hash = file_hash(path)
        realpath = configuration.translate_path(path)
        hash = file_hash(realpath)

        qry = "CALL add_file(%s, %s);"
        cur.execute(qry, (path, hash))

        if cursor is None:
@@ -176,7 +177,7 @@ class Datastore:
        else:
            cur = cursor

        hash = file_hash(path)
        hash = file_hash(configuration.translate_path(path))

        qry = """
            UPDATE raw_lines rl
            SET ntbp = %s
@@ -588,7 +589,63 @@ class Datastore:
            # We do not commit if we've been passed a cursor, instead
            # we assume that we are in the middle of a transaction

    def get_file_data(self, path, cursor = None):
        """
        Retrieve arbitrary data associated with a file.
        """
        if cursor is None:
            cur = self.conn.cursor()
        else:
            cur = cursor

        realpath = configuration.translate_path(path)
        hash = file_hash(realpath)

        qry = """
            SELECT data
            FROM file_data
            WHERE hash = %s;
        """
        cur.execute(qry, (hash,))
        res = cur.fetchone()

        if cursor is None:
            self.maybe_commit()
            # We do not commit if we've been passed a cursor, instead
            # we assume that we are in the middle of a transaction

        return res[0]

    def surveys (self, include_archived = False):
        """
        Return list of survey definitions.
        """
        if self.conn is None:
            self.connect()

        if include_archived:
            qry = """
                SELECT meta
                FROM public.projects;
            """
        else:
            qry = """
                SELECT meta
                FROM public.projects
                WHERE NOT (meta->'archived')::boolean IS true
            """

        with self.conn:
            with self.conn.cursor() as cursor:
                cursor.execute(qry)
                results = cursor.fetchall()
                return [r[0] for r in results if r[0]]

    # TODO Does this need tweaking on account of #246?
    def apply_survey_configuration(self, cursor = None):
        if cursor is None:
            cur = self.conn.cursor()

(file name not shown)

@@ -9,11 +9,9 @@ from datastore import Datastore
if __name__ == '__main__':
    print("Reading configuration")
    surveys = configuration.surveys()

    print("Connecting to database")
    db = Datastore()
    surveys = db.surveys()

    print("Reading surveys")
    for survey in surveys:

(file name not shown)

@@ -51,12 +51,11 @@ def del_pending_remark(db, sequence):
if __name__ == '__main__':
    print("Reading configuration")
    surveys = configuration.surveys()
    file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)

    print("Connecting to database")
    db = Datastore()
    db.connect()
    surveys = db.surveys()

    print("Reading surveys")
    for survey in surveys:
@@ -77,29 +76,31 @@ if __name__ == '__main__':
        pendingRx = re.compile(survey["final"]["pending"]["pattern"]["regex"])

        for fileprefix in final_p111["paths"]:
            print(f"Path prefix: {fileprefix}")
            realprefix = configuration.translate_path(fileprefix)
            print(f"Path prefix: {fileprefix}{realprefix}")
            for globspec in final_p111["globs"]:
                for filepath in pathlib.Path(fileprefix).glob(globspec):
                    filepath = str(filepath)
                    print(f"Found {filepath}")
                for physical_filepath in pathlib.Path(realprefix).glob(globspec):
                    physical_filepath = str(physical_filepath)
                    logical_filepath = configuration.untranslate_path(physical_filepath)
                    print(f"Found {logical_filepath}")

                    pending = False
                    if pendingRx:
                        pending = pendingRx.search(filepath) is not None
                        pending = pendingRx.search(physical_filepath) is not None

                    if not db.file_in_db(filepath):
                    if not db.file_in_db(logical_filepath):
                        age = time.time() - os.path.getmtime(filepath)
                        age = time.time() - os.path.getmtime(physical_filepath)
                        if age < file_min_age:
                            print("Skipping file because too new", filepath)
                            print("Skipping file because too new", logical_filepath)
                            continue

                        print("Importing")
                        match = rx.match(os.path.basename(filepath))
                        match = rx.match(os.path.basename(logical_filepath))
                        if not match:
                            error_message = f"File path not match the expected format! ({filepath} ~ {pattern['regex']})"
                            error_message = f"File path not match the expected format! ({logical_filepath} ~ {pattern['regex']})"
                            print(error_message, file=sys.stderr)
                            print("This file will be ignored!")
                            continue
@@ -108,21 +109,21 @@ if __name__ == '__main__':
file_info["meta"] = {}
if pending:
print("Skipping / removing final file because marked as PENDING", filepath)
print("Skipping / removing final file because marked as PENDING", logical_filepath)
db.del_sequence_final(file_info["sequence"])
add_pending_remark(db, file_info["sequence"])
continue
else:
del_pending_remark(db, file_info["sequence"])
p111_data = p111.from_file(filepath)
p111_data = p111.from_file(physical_filepath)
print("Saving")
p111_records = p111.p111_type("S", p111_data)
file_info["meta"]["lineName"] = p111.line_name(p111_data)
db.save_final_p111(p111_records, file_info, filepath, survey["epsg"])
db.save_final_p111(p111_records, file_info, logical_filepath, survey["epsg"])
else:
print("Already in DB")
if pending:

bin/import_map_layers.py (new executable file): 127 lines

@@ -0,0 +1,127 @@
#!/usr/bin/python3
"""
Import SmartSource data.

For each survey in configuration.surveys(), check for new
or modified final gun header files and (re-)import them into the
database.
"""

import os
import sys
import pathlib
import re
import time
import json

import configuration
from datastore import Datastore

if __name__ == '__main__':
    """
    Imports map layers from the directories defined in the configuration object
    `import.map.layers`. The content of that key is an object with the following
    structure:

    {
      layer1Name: [
        format: "geojson",
        path: "",            // Logical path to a directory
        globs: [
          "**/*.geojson",    // List of globs matching map data files
        ]
      ],
      layer2Name: …
    }
    """

    def process (layer_name, layer, physical_filepath):
        physical_filepath = str(physical_filepath)
        logical_filepath = configuration.untranslate_path(physical_filepath)
        print(f"Found {logical_filepath}")
        if not db.file_in_db(logical_filepath):
            age = time.time() - os.path.getmtime(physical_filepath)
            if age < file_min_age:
                print("Skipping file because too new", logical_filepath)
                return
            print("Importing")
            file_info = {
                "type": "map_layer",
                "format": layer["format"],
                "name": layer_name,
                "tooltip": layer.get("tooltip"),
                "popup": layer.get("popup")
            }
            db.save_file_data(logical_filepath, json.dumps(file_info))
        else:
            file_info = db.get_file_data(logical_filepath)
            dirty = False
            if file_info:
                if file_info["name"] != layer_name:
                    print("Renaming to", layer_name)
                    file_info["name"] = layer_name
                    dirty = True
                if file_info.get("tooltip") != layer.get("tooltip"):
                    print("Changing tooltip to", layer.get("tooltip") or "null")
                    file_info["tooltip"] = layer.get("tooltip")
                    dirty = True
                if file_info.get("popup") != layer.get("popup"):
                    print("Changing popup to", layer.get("popup") or "null")
                    file_info["popup"] = layer.get("popup")
                    dirty = True
                if dirty:
                    db.save_file_data(logical_filepath, json.dumps(file_info))
                else:
                    print("Already in DB")

    print("Reading configuration")
    file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)

    print("Connecting to database")
    db = Datastore()
    surveys = db.surveys()

    print("Reading surveys")
    for survey in surveys:
        print(f'Survey: {survey["id"]} ({survey["schema"]})')
        db.set_survey(survey["schema"])
        try:
            map_layers = survey["imports"]["map"]["layers"]
        except KeyError:
            print("No map layers defined")
            continue
        for layer_name, layer_items in map_layers.items():
            for layer in layer_items:
                fileprefix = layer["path"]
                realprefix = configuration.translate_path(fileprefix)
                if os.path.isfile(realprefix):
                    process(layer_name, layer, realprefix)
                elif os.path.isdir(realprefix):
                    if not "globs" in layer:
                        layer["globs"] = [ "**/*.geojson" ]
                    for globspec in layer["globs"]:
                        for physical_filepath in pathlib.Path(realprefix).glob(globspec):
                            process(layer_name, layer, physical_filepath)

    print("Done")

(file name not shown)

@@ -17,29 +17,31 @@ from datastore import Datastore
if __name__ == '__main__':
    print("Reading configuration")
    surveys = configuration.surveys()
    file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)

    print("Connecting to database")
    db = Datastore()
    surveys = db.surveys()

    print("Reading configuration")
    file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)

    print("Reading surveys")
    for survey in surveys:
        print(f'Survey: {survey["id"]} ({survey["schema"]})')
        db.set_survey(survey["schema"])
        for file in survey["preplots"]:
            realpath = configuration.translate_path(file["path"])
            print(f"Preplot: {file['path']}")
            if not db.file_in_db(file["path"]):
                age = time.time() - os.path.getmtime(file["path"])
                age = time.time() - os.path.getmtime(realpath)
                if age < file_min_age:
                    print("Skipping file because too new", file["path"])
                    continue
                print("Importing")
                try:
                    preplot = preplots.from_file(file)
                    preplot = preplots.from_file(file, realpath)
                except FileNotFoundError:
                    print(f"File does not exist: {file['path']}", file=sys.stderr)
                    continue

(file name not shown)

@@ -20,12 +20,11 @@ from datastore import Datastore
if __name__ == '__main__':
    print("Reading configuration")
    surveys = configuration.surveys()
    file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)

    print("Connecting to database")
    db = Datastore()
    db.connect()
    surveys = db.surveys()

    print("Reading surveys")
    for survey in surveys:
@@ -46,30 +45,32 @@ if __name__ == '__main__':
        ntbpRx = re.compile(survey["raw"]["ntbp"]["pattern"]["regex"])

        for fileprefix in raw_p111["paths"]:
            print(f"Path prefix: {fileprefix}")
            realprefix = configuration.translate_path(fileprefix)
            print(f"Path prefix: {fileprefix}{realprefix}")
            for globspec in raw_p111["globs"]:
                for filepath in pathlib.Path(fileprefix).glob(globspec):
                    filepath = str(filepath)
                    print(f"Found {filepath}")
                for physical_filepath in pathlib.Path(realprefix).glob(globspec):
                    physical_filepath = str(physical_filepath)
                    logical_filepath = configuration.untranslate_path(physical_filepath)
                    print(f"Found {logical_filepath}")

                    if ntbpRx:
                        ntbp = ntbpRx.search(filepath) is not None
                        ntbp = ntbpRx.search(physical_filepath) is not None
                    else:
                        ntbp = False

                    if not db.file_in_db(filepath):
                    if not db.file_in_db(logical_filepath):
                        age = time.time() - os.path.getmtime(filepath)
                        age = time.time() - os.path.getmtime(physical_filepath)
                        if age < file_min_age:
                            print("Skipping file because too new", filepath)
                            print("Skipping file because too new", logical_filepath)
                            continue

                        print("Importing")
                        match = rx.match(os.path.basename(filepath))
                        match = rx.match(os.path.basename(logical_filepath))
                        if not match:
                            error_message = f"File path not match the expected format! ({filepath} ~ {pattern['regex']})"
                            error_message = f"File path not matching the expected format! ({logical_filepath} ~ {pattern['regex']})"
                            print(error_message, file=sys.stderr)
                            print("This file will be ignored!")
                            continue
@@ -77,7 +78,7 @@ if __name__ == '__main__':
                        file_info = dict(zip(pattern["captures"], match.groups()))
                        file_info["meta"] = {}

                        p111_data = p111.from_file(filepath)
                        p111_data = p111.from_file(physical_filepath)

                        print("Saving")
@@ -85,7 +86,7 @@ if __name__ == '__main__':
                        if len(p111_records):
                            file_info["meta"]["lineName"] = p111.line_name(p111_data)
                            db.save_raw_p111(p111_records, file_info, filepath, survey["epsg"], ntbp=ntbp)
                            db.save_raw_p111(p111_records, file_info, logical_filepath, survey["epsg"], ntbp=ntbp)
                        else:
                            print("No source records found in file")
                    else:
@@ -93,7 +94,7 @@ if __name__ == '__main__':
                        # Update the NTBP status to whatever the latest is,
                        # as it might have changed.
                        db.set_ntbp(filepath, ntbp)
                        db.set_ntbp(logical_filepath, ntbp)
                        if ntbp:
                            print("Sequence is NTBP")

(file name not shown)

@@ -20,12 +20,11 @@ from datastore import Datastore
if __name__ == '__main__':
    print("Reading configuration")
    surveys = configuration.surveys()
    file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)

    print("Connecting to database")
    db = Datastore()
    db.connect()
    surveys = db.surveys()

    print("Reading surveys")
    for survey in surveys:
@@ -47,36 +46,38 @@ if __name__ == '__main__':
rx = re.compile(pattern["regex"], flags)
for fileprefix in raw_smsrc["paths"]:
print(f"Path prefix: {fileprefix}")
realprefix = configuration.translate_path(fileprefix)
print(f"Path prefix: {fileprefix}{realprefix}")
for globspec in raw_smsrc["globs"]:
for filepath in pathlib.Path(fileprefix).glob(globspec):
filepath = str(filepath)
print(f"Found {filepath}")
for physical_filepath in pathlib.Path(realprefix).glob(globspec):
physical_filepath = str(physical_filepath)
logical_filepath = configuration.untranslate_path(physical_filepath)
print(f"Found {logical_filepath}")
if not db.file_in_db(filepath):
if not db.file_in_db(logical_filepath):
age = time.time() - os.path.getmtime(filepath)
age = time.time() - os.path.getmtime(physical_filepath)
if age < file_min_age:
print("Skipping file because too new", filepath)
print("Skipping file because too new", logical_filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))
match = rx.match(os.path.basename(logical_filepath))
if not match:
error_message = f"File path not matching the expected format! ({filepath} ~ {pattern['regex']})"
error_message = f"File path not matching the expected format! ({logical_filepath} ~ {pattern['regex']})"
print(error_message, file=sys.stderr)
print("This file will be ignored!")
continue
file_info = dict(zip(pattern["captures"], match.groups()))
smsrc_records = smsrc.from_file(filepath)
smsrc_records = smsrc.from_file(physical_filepath)
print("Saving")
db.save_raw_smsrc(smsrc_records, file_info, filepath)
db.save_raw_smsrc(smsrc_records, file_info, logical_filepath)
else:
print("Already in DB")

(file name not shown)

@@ -15,25 +15,4 @@ from datastore import Datastore
if __name__ == '__main__':
    print("Reading configuration")
    configs = configuration.files(include_archived = True)

    print("Connecting to database")
    db = Datastore()
    #db.connect()

    print("Reading surveys")
    for config in configs:
        filepath = config[0]
        survey = config[1]
        print(f'Survey: {survey["id"]} ({filepath})')
        db.set_survey(survey["schema"])
        if not db.file_in_db(filepath):
            print("Saving to DB")
            db.save_file_data(filepath, json.dumps(survey))
            print("Applying survey configuration")
            db.apply_survey_configuration()
        else:
            print("Already in DB")

    print("Done")
    print("This function is obsolete. Returning with no action")

(file name not shown)

@@ -4,9 +4,10 @@ import sps
Preplot importing functions.
"""

def from_file (file):
def from_file (file, realpath = None):
    filepath = realpath or file["path"]
    if not "type" in file or file["type"] == "sps":
        records = sps.from_file(file["path"], file["format"] if "format" in file else None )
        records = sps.from_file(filepath, file["format"] if "format" in file else None )
    else:
        return "Not an SPS file"

(file name not shown)

@@ -13,21 +13,27 @@ from datastore import Datastore
if __name__ == '__main__':
    print("Reading configuration")
    surveys = configuration.surveys()

    print("Connecting to database")
    db = Datastore()

    print("Reading configuration")
    surveys = db.surveys()

    print("Reading surveys")
    for survey in surveys:
        print(f'Survey: {survey["id"]} ({survey["schema"]})')
        db.set_survey(survey["schema"])
        for file in db.list_files():
            path = file[0]
            if not os.path.exists(path):
                print(path, "NOT FOUND")
                db.del_file(path)
            try:
                path = configuration.translate_path(file[0])
                if not os.path.exists(path):
                    print(path, "NOT FOUND")
                    db.del_file(file[0])
            except TypeError:
                # In case the logical path no longer matches
                # the Dougal configuration.
                print(file[0], "COULD NOT BE TRANSLATED TO A PHYSICAL PATH. DELETING")
                db.del_file(file[0])

    print("Done")

View File

@@ -90,6 +90,12 @@ function run () {
    rm $STDOUTLOG $STDERRLOG
}

function cleanup () {
    if [[ -f $LOCKFILE ]]; then
        rm "$LOCKFILE"
    fi
}

if [[ -f $LOCKFILE ]]; then
    PID=$(cat "$LOCKFILE")
    if pgrep -F "$LOCKFILE"; then
@@ -107,6 +113,13 @@ echo "$$" > "$LOCKFILE" || {
}

print_info "Start run"

print_log "Check if data is accessible"
$BINDIR/check_mounts_present.py || {
    print_warning "Import mounts not accessible. Inhibiting all tasks!"
    cleanup
    exit 253
}

print_log "Purge deleted files"
run $BINDIR/purge_deleted_files.py
@@ -119,18 +132,21 @@ run $BINDIR/import_preplots.py
print_log "Import raw P1/11"
run $BINDIR/import_raw_p111.py
print_log "Import raw P1/90"
run $BINDIR/import_raw_p190.py
#print_log "Import raw P1/90"
#run $BINDIR/import_raw_p190.py
print_log "Import final P1/11"
run $BINDIR/import_final_p111.py
print_log "Import final P1/90"
run $BINDIR/import_final_p190.py
#print_log "Import final P1/90"
#run $BINDIR/import_final_p190.py
print_log "Import SmartSource data"
run $BINDIR/import_smsrc.py
print_log "Import map user layers"
run $BINDIR/import_map_layers.py
# if [[ -z "$RUNNER_NOEXPORT" ]]; then
# print_log "Export system data"
# run $BINDIR/system_exports.py

(file name not shown)

@@ -32,6 +32,25 @@ imports:
  # least this many seconds ago.
  file_min_age: 60

  # These paths refer to remote mounts which must be present in order
  # for imports to work. If any of these paths are empty, import actions
  # (including data deletion) will be inhibited. This is to cope with
  # things like transient network failures.
  mounts:
    - /srv/mnt/Data

  # These paths can be exposed to end users via the API. They should
  # contain the locations where project data, or any other user data
  # that needs to be accessible by Dougal, is located.
  #
  # This key can be either a string or an object:
  # - If a string, it points to the root path for Dougal-accessible data.
  # - If an object, there is an implicit root and the first-level
  #   paths are denoted by the keys, with the values being their
  #   respective physical paths.
  # Non-absolute paths are relative to $DOUGAL_ROOT.
  paths: /srv/mnt/Data

queues:
  asaqc:
    request:

(file name not shown)

@@ -1,3 +1,5 @@
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.12"}')
\connect dougal
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.2"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.12"}' WHERE public.info.key = 'version';
SET value = public.info.value || '{"db_schema": "0.4.2"}' WHERE public.info.key = 'version';

(file name not shown)

@@ -2,8 +2,8 @@
-- PostgreSQL database dump
--
-- Dumped from database version 14.2
-- Dumped by pg_dump version 14.2
-- Dumped from database version 14.8
-- Dumped by pg_dump version 14.9
SET statement_timeout = 0;
SET lock_timeout = 0;
@@ -70,173 +70,171 @@ If the path matches that of an existing entry, delete that entry (which cascades
CREATE PROCEDURE _SURVEY__TEMPLATE_.adjust_planner()
LANGUAGE plpgsql
AS $$
DECLARE
_planner_config jsonb;
_planned_line planned_lines%ROWTYPE;
_lag interval;
_last_sequence sequences_summary%ROWTYPE;
_deltatime interval;
_shotinterval interval;
_tstamp timestamptz;
_incr integer;
BEGIN
DECLARE
_planner_config jsonb;
_planned_line planned_lines%ROWTYPE;
_lag interval;
_last_sequence sequences_summary%ROWTYPE;
_deltatime interval;
_shotinterval interval;
_tstamp timestamptz;
_incr integer;
BEGIN
SET CONSTRAINTS planned_lines_pkey DEFERRED;
SET CONSTRAINTS planned_lines_pkey DEFERRED;
SELECT data->'planner'
INTO _planner_config
FROM file_data
WHERE data ? 'planner';
SELECT project_configuration()->'planner'
INTO _planner_config;
SELECT *
INTO _last_sequence
FROM sequences_summary
ORDER BY sequence DESC
LIMIT 1;
SELECT *
INTO _last_sequence
FROM sequences_summary
ORDER BY sequence DESC
LIMIT 1;
SELECT *
INTO _planned_line
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
SELECT *
INTO _planned_line
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
SELECT
COALESCE(
((lead(ts0) OVER (ORDER BY sequence)) - ts1),
make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer)
)
INTO _lag
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
_incr = sign(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE '_planner_config: %', _planner_config;
RAISE NOTICE '_last_sequence: %', _last_sequence;
RAISE NOTICE '_planned_line: %', _planned_line;
RAISE NOTICE '_incr: %', _incr;
-- Does the latest sequence match a planned sequence?
IF _planned_line IS NULL THEN -- No it doesn't
RAISE NOTICE 'Latest sequence shot does not match a planned sequence';
SELECT * INTO _planned_line FROM planned_lines ORDER BY sequence ASC LIMIT 1;
RAISE NOTICE '_planned_line: %', _planned_line;
IF _planned_line.sequence <= _last_sequence.sequence THEN
RAISE NOTICE 'Renumbering the planned sequences starting from %', _planned_line.sequence + 1;
-- Renumber the planned sequences starting from last shot sequence number + 1
UPDATE planned_lines
SET sequence = sequence + _last_sequence.sequence - _planned_line.sequence + 1;
END IF;
-- The correction to make to the first planned line's ts0 will be based on either the last
-- sequence's EOL + default line change time or the current time, whichever is later.
_deltatime := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1) + make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer), current_timestamp) - _planned_line.ts0;
-- Is the first of the planned lines start time in the past? (±5 mins)
IF _planned_line.ts0 < (current_timestamp - make_interval(mins => 5)) THEN
RAISE NOTICE 'First planned line is in the past. Adjusting times by %', _deltatime;
-- Adjust the start / end time of the planned lines by assuming that we are at
-- `defaultLineChangeDuration` minutes away from SOL of the first planned line.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime;
END IF;
ELSE -- Yes it does
RAISE NOTICE 'Latest sequence does match a planned sequence: %, %', _planned_line.sequence, _planned_line.line;
-- Is it online?
IF EXISTS(SELECT 1 FROM raw_lines_files WHERE sequence = _last_sequence.sequence AND hash = '*online*') THEN
-- Yes it is
RAISE NOTICE 'Sequence % is online', _last_sequence.sequence;
-- Let us get the SOL from the events log if we can
RAISE NOTICE 'Trying to set fsp, ts0 from events log FSP, FGSP';
WITH e AS (
SELECT * FROM event_log
WHERE
sequence = _last_sequence.sequence
AND ('FSP' = ANY(labels) OR 'FGSP' = ANY(labels))
ORDER BY tstamp LIMIT 1
)
UPDATE planned_lines
SET
fsp = COALESCE(e.point, fsp),
ts0 = COALESCE(e.tstamp, ts0)
FROM e
WHERE planned_lines.sequence = _last_sequence.sequence;
-- Shot interval
_shotinterval := (_last_sequence.ts1 - _last_sequence.ts0) / abs(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE 'Estimating EOL from current shot interval: %', _shotinterval;
SELECT (abs(lsp-fsp) * _shotinterval + ts0) - ts1
INTO _deltatime
FROM planned_lines
WHERE sequence = _last_sequence.sequence;
---- Set ts1 for the current sequence
--UPDATE planned_lines
--SET
--ts1 = (abs(lsp-fsp) * _shotinterval) + ts0
--WHERE sequence = _last_sequence.sequence;
RAISE NOTICE 'Adjustment is %', _deltatime;
IF abs(EXTRACT(EPOCH FROM _deltatime)) < 8 THEN
RAISE NOTICE 'Adjustment too small (< 8 s), so not applying it';
RETURN;
END IF;
-- Adjust ts1 for the current sequence
UPDATE planned_lines
SET ts1 = ts1 + _deltatime
WHERE sequence = _last_sequence.sequence;
-- Now shift all sequences after
UPDATE planned_lines
SET ts0 = ts0 + _deltatime, ts1 = ts1 + _deltatime
WHERE sequence > _last_sequence.sequence;
RAISE NOTICE 'Deleting planned sequences before %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence < _last_sequence.sequence;
ELSE
-- No it isn't
RAISE NOTICE 'Sequence % is offline', _last_sequence.sequence;
-- We were supposed to finish at _planned_line.ts1 but we finished at:
_tstamp := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1), current_timestamp);
-- WARNING Next line is for testing only
--_tstamp := COALESCE(_last_sequence.ts1_final, _last_sequence.ts1);
-- So we need to adjust timestamps by:
_deltatime := _tstamp - _planned_line.ts1;
RAISE NOTICE 'Planned end: %, actual end: % (%, %)', _planned_line.ts1, _tstamp, _planned_line.sequence, _last_sequence.sequence;
RAISE NOTICE 'Shifting times by % for sequences > %', _deltatime, _planned_line.sequence;
-- NOTE: This won't work if sequences are not, err… sequential.
-- NOTE: This has been known to happen in 2020.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime
WHERE sequence > _planned_line.sequence;
RAISE NOTICE 'Deleting planned sequences up to %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence <= _last_sequence.sequence;
END IF;
END IF;
END;
$$;
@@ -367,8 +365,8 @@ COMMENT ON PROCEDURE _SURVEY__TEMPLATE_.augment_event_data(IN maxspan numeric) I
CREATE FUNCTION _SURVEY__TEMPLATE_.binning_parameters() RETURNS jsonb
LANGUAGE sql STABLE LEAKPROOF PARALLEL SAFE
AS $$
SELECT data->'binning' binning FROM file_data WHERE data->>'binning' IS NOT NULL LIMIT 1;
$$;
SELECT project_configuration()->'binning' binning;
$$;
ALTER FUNCTION _SURVEY__TEMPLATE_.binning_parameters() OWNER TO postgres;
@@ -1042,6 +1040,39 @@ ALTER PROCEDURE _SURVEY__TEMPLATE_.log_midnight_shots(IN dt0 date, IN dt1 date)
COMMENT ON PROCEDURE _SURVEY__TEMPLATE_.log_midnight_shots(IN dt0 date, IN dt1 date) IS 'Add midnight shots between two dates dt0 and dt1 to the event_log, unless the events already exist.';
--
-- Name: project_configuration(); Type: FUNCTION; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
CREATE FUNCTION _SURVEY__TEMPLATE_.project_configuration() RETURNS jsonb
LANGUAGE plpgsql
AS $$
DECLARE
schema_name text;
configuration jsonb;
BEGIN
SELECT nspname
INTO schema_name
FROM pg_namespace
WHERE oid = (
SELECT pronamespace
FROM pg_proc
WHERE oid = 'project_configuration'::regproc::oid
);
SELECT meta
INTO configuration
FROM public.projects
WHERE schema = schema_name;
RETURN configuration;
END
$$;
ALTER FUNCTION _SURVEY__TEMPLATE_.project_configuration() OWNER TO postgres;
--
-- Name: replace_placeholders(text, timestamp with time zone, integer, integer); Type: FUNCTION; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
@@ -1488,9 +1519,9 @@ CREATE VIEW _SURVEY__TEMPLATE_.final_lines_summary AS
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
(( SELECT count(*) AS count
FROM _SURVEY__TEMPLATE_.preplot_points
WHERE ((preplot_points.line = fl.line) AND (((preplot_points.point >= s.fsp) AND (preplot_points.point <= s.lsp)) OR ((preplot_points.point >= s.lsp) AND (preplot_points.point <= s.fsp))))) - s.num_points) AS missing_shots,
( SELECT count(*) AS count
FROM _SURVEY__TEMPLATE_.missing_sequence_final_points
WHERE missing_sequence_final_points.sequence = s.sequence) AS missing_shots,
s.length,
s.azimuth,
fl.remarks,
@@ -2137,9 +2168,9 @@ CREATE VIEW _SURVEY__TEMPLATE_.raw_lines_summary AS
(s.ts1 - s.ts0) AS duration,
s.num_points,
s.num_preplots,
(( SELECT count(*) AS count
FROM _SURVEY__TEMPLATE_.preplot_points
WHERE ((preplot_points.line = rl.line) AND (((preplot_points.point >= s.fsp) AND (preplot_points.point <= s.lsp)) OR ((preplot_points.point >= s.lsp) AND (preplot_points.point <= s.fsp))))) - s.num_preplots) AS missing_shots,
(SELECT count(*) AS count
FROM _SURVEY__TEMPLATE_.missing_sequence_raw_points
WHERE missing_sequence_raw_points.sequence = s.sequence) AS missing_shots,
s.length,
s.azimuth,
rl.remarks,

View File

@@ -0,0 +1,162 @@
-- Fix wrong number of missing shots in summary views
--
-- New schema version: 0.3.13
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- Fixes a bug in the `final_lines_summary` and `raw_lines_summary` views
-- which results in the number of missing shots being miscounted on jobs
-- using three sources.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE VIEW raw_lines_summary AS
WITH summary AS (
SELECT DISTINCT rs.sequence,
first_value(rs.point) OVER w AS fsp,
last_value(rs.point) OVER w AS lsp,
first_value(rs.tstamp) OVER w AS ts0,
last_value(rs.tstamp) OVER w AS ts1,
count(rs.point) OVER w AS num_points,
count(pp.point) OVER w AS num_preplots,
public.st_distance(first_value(rs.geometry) OVER w, last_value(rs.geometry) OVER w) AS length,
((public.st_azimuth(first_value(rs.geometry) OVER w, last_value(rs.geometry) OVER w) * (180)::double precision) / pi()) AS azimuth
FROM (raw_shots rs
LEFT JOIN preplot_points pp USING (line, point))
WINDOW w AS (PARTITION BY rs.sequence ORDER BY rs.tstamp ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
)
SELECT rl.sequence,
rl.line,
s.fsp,
s.lsp,
s.ts0,
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
s.num_preplots,
(SELECT count(*) AS count
FROM missing_sequence_raw_points
WHERE missing_sequence_raw_points.sequence = s.sequence) AS missing_shots,
s.length,
s.azimuth,
rl.remarks,
rl.ntbp,
rl.meta
FROM (summary s
JOIN raw_lines rl USING (sequence));
CREATE OR REPLACE VIEW final_lines_summary AS
WITH summary AS (
SELECT DISTINCT fs.sequence,
first_value(fs.point) OVER w AS fsp,
last_value(fs.point) OVER w AS lsp,
first_value(fs.tstamp) OVER w AS ts0,
last_value(fs.tstamp) OVER w AS ts1,
count(fs.point) OVER w AS num_points,
public.st_distance(first_value(fs.geometry) OVER w, last_value(fs.geometry) OVER w) AS length,
((public.st_azimuth(first_value(fs.geometry) OVER w, last_value(fs.geometry) OVER w) * (180)::double precision) / pi()) AS azimuth
FROM final_shots fs
WINDOW w AS (PARTITION BY fs.sequence ORDER BY fs.tstamp ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
)
SELECT fl.sequence,
fl.line,
s.fsp,
s.lsp,
s.ts0,
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
( SELECT count(*) AS count
FROM missing_sequence_final_points
WHERE missing_sequence_final_points.sequence = s.sequence) AS missing_shots,
s.length,
s.azimuth,
fl.remarks,
fl.meta
FROM (summary s
JOIN final_lines fl USING (sequence));
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.3.13' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.3.12' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.13"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.13"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--

View File

@@ -0,0 +1,122 @@
-- Add project_configuration() function
--
-- New schema version: 0.4.0
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This adapts the schema to the change in how project configurations are
-- handled (https://gitlab.com/wgp/dougal/software/-/merge_requests/29)
-- by creating a project_configuration() function which returns the
-- current project's configuration data.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION project_configuration()
RETURNS jsonb
LANGUAGE plpgsql
AS $$
DECLARE
schema_name text;
configuration jsonb;
BEGIN
SELECT nspname
INTO schema_name
FROM pg_namespace
WHERE oid = (
SELECT pronamespace
FROM pg_proc
WHERE oid = 'project_configuration'::regproc::oid
);
SELECT meta
INTO configuration
FROM public.projects
WHERE schema = schema_name;
RETURN configuration;
END
$$;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.0' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.3.12' AND current_db_version != '0.3.13' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.0"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.0"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--
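For orientation, a minimal sketch of reading the configuration this function exposes from the application side, assuming the node-postgres (pg) client and an invented schema name survey_example; project_configuration() resolves the schema it lives in, so a schema-qualified call selects the project.
// Hedged usage sketch; schema name and config keys are assumptions.
const { Client } = require('pg');
async function readProjectConfig () {
  const client = new Client();
  await client.connect();
  // jsonb comes back from pg as a parsed JavaScript object.
  const res = await client.query('SELECT survey_example.project_configuration() AS cfg');
  await client.end();
  return res.rows[0].cfg;
}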

View File

@@ -0,0 +1,264 @@
-- Modify adjust_planner() to use project_configuration()
--
-- New schema version: 0.4.1
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This modifies adjust_planner() to use project_configuration()
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE PROCEDURE adjust_planner()
LANGUAGE plpgsql
AS $$
DECLARE
_planner_config jsonb;
_planned_line planned_lines%ROWTYPE;
_lag interval;
_last_sequence sequences_summary%ROWTYPE;
_deltatime interval;
_shotinterval interval;
_tstamp timestamptz;
_incr integer;
BEGIN
SET CONSTRAINTS planned_lines_pkey DEFERRED;
SELECT project_configuration()->'planner'
INTO _planner_config;
SELECT *
INTO _last_sequence
FROM sequences_summary
ORDER BY sequence DESC
LIMIT 1;
SELECT *
INTO _planned_line
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
SELECT
COALESCE(
((lead(ts0) OVER (ORDER BY sequence)) - ts1),
make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer)
)
INTO _lag
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
_incr = sign(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE '_planner_config: %', _planner_config;
RAISE NOTICE '_last_sequence: %', _last_sequence;
RAISE NOTICE '_planned_line: %', _planned_line;
RAISE NOTICE '_incr: %', _incr;
-- Does the latest sequence match a planned sequence?
IF _planned_line IS NULL THEN -- No it doesn't
RAISE NOTICE 'Latest sequence shot does not match a planned sequence';
SELECT * INTO _planned_line FROM planned_lines ORDER BY sequence ASC LIMIT 1;
RAISE NOTICE '_planned_line: %', _planned_line;
IF _planned_line.sequence <= _last_sequence.sequence THEN
RAISE NOTICE 'Renumbering the planned sequences starting from %', _planned_line.sequence + 1;
-- Renumber the planned sequences starting from last shot sequence number + 1
UPDATE planned_lines
SET sequence = sequence + _last_sequence.sequence - _planned_line.sequence + 1;
END IF;
-- The correction to make to the first planned line's ts0 will be based on either the last
-- sequence's EOL + default line change time or the current time, whichever is later.
_deltatime := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1) + make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer), current_timestamp) - _planned_line.ts0;
-- Is the first of the planned lines start time in the past? (±5 mins)
IF _planned_line.ts0 < (current_timestamp - make_interval(mins => 5)) THEN
RAISE NOTICE 'First planned line is in the past. Adjusting times by %', _deltatime;
-- Adjust the start / end time of the planned lines by assuming that we are at
-- `defaultLineChangeDuration` minutes away from SOL of the first planned line.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime;
END IF;
ELSE -- Yes it does
RAISE NOTICE 'Latest sequence does match a planned sequence: %, %', _planned_line.sequence, _planned_line.line;
-- Is it online?
IF EXISTS(SELECT 1 FROM raw_lines_files WHERE sequence = _last_sequence.sequence AND hash = '*online*') THEN
-- Yes it is
RAISE NOTICE 'Sequence % is online', _last_sequence.sequence;
-- Let us get the SOL from the events log if we can
RAISE NOTICE 'Trying to set fsp, ts0 from events log FSP, FGSP';
WITH e AS (
SELECT * FROM event_log
WHERE
sequence = _last_sequence.sequence
AND ('FSP' = ANY(labels) OR 'FGSP' = ANY(labels))
ORDER BY tstamp LIMIT 1
)
UPDATE planned_lines
SET
fsp = COALESCE(e.point, fsp),
ts0 = COALESCE(e.tstamp, ts0)
FROM e
WHERE planned_lines.sequence = _last_sequence.sequence;
-- Shot interval
_shotinterval := (_last_sequence.ts1 - _last_sequence.ts0) / abs(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE 'Estimating EOL from current shot interval: %', _shotinterval;
SELECT (abs(lsp-fsp) * _shotinterval + ts0) - ts1
INTO _deltatime
FROM planned_lines
WHERE sequence = _last_sequence.sequence;
---- Set ts1 for the current sequence
--UPDATE planned_lines
--SET
--ts1 = (abs(lsp-fsp) * _shotinterval) + ts0
--WHERE sequence = _last_sequence.sequence;
RAISE NOTICE 'Adjustment is %', _deltatime;
IF abs(EXTRACT(EPOCH FROM _deltatime)) < 8 THEN
RAISE NOTICE 'Adjustment too small (< 8 s), so not applying it';
RETURN;
END IF;
-- Adjust ts1 for the current sequence
UPDATE planned_lines
SET ts1 = ts1 + _deltatime
WHERE sequence = _last_sequence.sequence;
-- Now shift all sequences after
UPDATE planned_lines
SET ts0 = ts0 + _deltatime, ts1 = ts1 + _deltatime
WHERE sequence > _last_sequence.sequence;
RAISE NOTICE 'Deleting planned sequences before %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence < _last_sequence.sequence;
ELSE
-- No it isn't
RAISE NOTICE 'Sequence % is offline', _last_sequence.sequence;
-- We were supposed to finish at _planned_line.ts1 but we finished at:
_tstamp := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1), current_timestamp);
-- WARNING Next line is for testing only
--_tstamp := COALESCE(_last_sequence.ts1_final, _last_sequence.ts1);
-- So we need to adjust timestamps by:
_deltatime := _tstamp - _planned_line.ts1;
RAISE NOTICE 'Planned end: %, actual end: % (%, %)', _planned_line.ts1, _tstamp, _planned_line.sequence, _last_sequence.sequence;
RAISE NOTICE 'Shifting times by % for sequences > %', _deltatime, _planned_line.sequence;
-- NOTE: This won't work if sequences are not, err… sequential.
-- NOTE: This has been known to happen in 2020.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime
WHERE sequence > _planned_line.sequence;
RAISE NOTICE 'Deleting planned sequences up to %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence <= _last_sequence.sequence;
END IF;
END IF;
END;
$$;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.1' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.4.0' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.1"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.1"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--
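As a worked illustration of the shot-interval arithmetic above (all values invented): the procedure derives a per-shot interval from the portion of the line already acquired and projects the end time over the full preplot range.
// Hedged sketch of the _shotinterval / _deltatime estimate in adjust_planner().
const ts0 = new Date('2023-10-01T00:00:00Z');  // SOL
const ts1 = new Date('2023-10-01T01:00:00Z');  // latest shot so far
const fsp = 1000, lspSoFar = 1180, lspPlanned = 1360;
// 3600 s over 180 shots = 20 s per shot
const shotInterval = (ts1 - ts0) / Math.abs(lspSoFar - fsp);
// Projected EOL: 360 shots * 20 s after SOL = 02:00Z
const estimatedEol = new Date(ts0.getTime() + Math.abs(lspPlanned - fsp) * shotInterval);
// Later planned sequences are shifted by (estimatedEol - plannedEol),
// unless the correction is under 8 seconds.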

View File

@@ -0,0 +1,98 @@
-- Modify binning_parameters() to use project_configuration()
--
-- New schema version: 0.4.2
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This modifies binning_parameters() to use project_configuration()
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION binning_parameters() RETURNS jsonb
LANGUAGE sql STABLE LEAKPROOF PARALLEL SAFE
AS $$
SELECT project_configuration()->'binning' binning;
$$;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.2' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.4.1' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.2"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.2"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--

View File

@@ -0,0 +1,164 @@
-- Support notification payloads larger than Postgres' NOTIFY limit.
--
-- New schema version: 0.4.3
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects the public schema only.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This creates a new table where large notification payloads are stored
-- temporarily and from which they might be recalled by the notification
-- listeners. It also creates a purge_notifications() procedure used to
-- clean up old notifications from the notifications log and finally,
-- modifies notify() to support these changes. When a large payload is
-- encountered, the payload is stored in the notify_payloads table and
-- a trimmed down version containing a notification_id is sent to listeners
-- instead. Listeners can then query notify_payloads to retrieve the full
-- payloads. It is the application layer's responsibility to delete old
-- notifications.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_schema () AS $outer$
BEGIN
RAISE NOTICE 'Updating public schema';
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO public');
CREATE TABLE IF NOT EXISTS public.notify_payloads (
id SERIAL,
tstamp timestamptz NOT NULL DEFAULT CURRENT_TIMESTAMP,
payload text NOT NULL DEFAULT '',
PRIMARY KEY (id)
);
CREATE INDEX IF NOT EXISTS notify_payload_tstamp ON notify_payloads (tstamp);
CREATE OR REPLACE FUNCTION public.notify() RETURNS trigger
LANGUAGE plpgsql
AS $$
DECLARE
channel text := TG_ARGV[0];
pid text;
payload text;
notification text;
payload_id integer;
BEGIN
SELECT projects.pid INTO pid FROM projects WHERE schema = TG_TABLE_SCHEMA;
payload := json_build_object(
'tstamp', CURRENT_TIMESTAMP,
'operation', TG_OP,
'schema', TG_TABLE_SCHEMA,
'table', TG_TABLE_NAME,
'old', row_to_json(OLD),
'new', row_to_json(NEW),
'pid', pid
)::text;
IF octet_length(payload) < 1000 THEN
PERFORM pg_notify(channel, payload);
ELSE
-- We need to find another solution
-- FIXME Consider storing the payload in a temporary memory table,
-- referenced by some form of autogenerated ID. Then send the ID
-- as the payload and then it's up to the user to fetch the original
-- payload if interested. This needs a mechanism to expire older payloads
-- in the interest of conserving memory.
INSERT INTO notify_payloads (payload) VALUES (payload) RETURNING id INTO payload_id;
notification := json_build_object(
'tstamp', CURRENT_TIMESTAMP,
'operation', TG_OP,
'schema', TG_TABLE_SCHEMA,
'table', TG_TABLE_NAME,
'pid', pid,
'payload_id', payload_id
)::text;
PERFORM pg_notify(channel, notification);
RAISE INFO 'Payload over limit';
END IF;
RETURN NULL;
END;
$$;
CREATE OR REPLACE PROCEDURE public.purge_notifications (age_seconds numeric DEFAULT 120) AS $$
DELETE FROM notify_payloads WHERE EXTRACT(epoch FROM CURRENT_TIMESTAMP - tstamp) > age_seconds;
$$ LANGUAGE sql;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.3' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.4.2' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
-- This upgrade modified the `public` schema only, not individual
-- project schemas.
CALL pg_temp.upgrade_schema();
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_schema ();
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.3"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.3"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--
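A sketch of the listener side of this scheme, assuming the node-postgres (pg) client and an invented channel name dougal; stub notifications carry a payload_id, which the listener resolves against notify_payloads.
// Hedged listener sketch; the channel name is an assumption.
const { Client } = require('pg');
async function listen () {
  const client = new Client();
  await client.connect();
  await client.query('LISTEN dougal');
  client.on('notification', async (msg) => {
    let payload = JSON.parse(msg.payload);
    if (payload.payload_id) {
      // Trimmed-down stub: recall the full payload by id.
      const res = await client.query(
        'SELECT payload FROM public.notify_payloads WHERE id = $1',
        [payload.payload_id]
      );
      if (res.rows.length) payload = JSON.parse(res.rows[0].payload);
    }
    console.log(payload.schema, payload.table, payload.operation);
    // Housekeeping is the application's job, e.g. periodically:
    // await client.query('CALL public.purge_notifications()');
  });
}
listen().catch(console.error);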

View File

@@ -0,0 +1,104 @@
-- Add event_log_changes function
--
-- New schema version: 0.4.4
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This adds a function event_log_changes which returns the subset of
-- events from event_log_full which have been modified on or after a
-- given timestamp.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION event_log_changes(ts0 timestamptz)
RETURNS SETOF event_log_full
LANGUAGE sql
AS $$
SELECT *
FROM event_log_full
WHERE lower(validity) > ts0 OR (upper(validity) IS NOT NULL AND upper(validity) > ts0)
ORDER BY lower(validity);
$$;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.4' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.4.3' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.4"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.4"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--
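By way of example, a client could poll this function with its last sync timestamp; a minimal sketch assuming the node-postgres (pg) client and an invented schema survey_example.
// Hedged polling sketch; the schema name is an assumption.
const { Client } = require('pg');
async function fetchEventChanges (since) {
  const client = new Client();
  await client.connect();
  const res = await client.query(
    'SELECT * FROM survey_example.event_log_changes($1)',
    [since] // a Date; rows modified on or after it come back
  );
  await client.end();
  return res.rows;
}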

View File

@@ -9,7 +9,7 @@
"version": "0.0.0",
"license": "UNLICENSED",
"dependencies": {
"@mdi/font": "^5.6.55",
"@mdi/font": "^7.2.96",
"core-js": "^3.6.5",
"d3": "^7.0.1",
"jwt-decode": "^3.0.0",
@@ -1763,9 +1763,9 @@
}
},
"node_modules/@mdi/font": {
"version": "5.9.55",
"resolved": "https://registry.npmjs.org/@mdi/font/-/font-5.9.55.tgz",
"integrity": "sha512-jswRF6q3eq8NWpWiqct6q+6Fg/I7nUhrxYJfiEM8JJpap0wVJLQdbKtyS65GdlK7S7Ytnx3TTi/bmw+tBhkGmg=="
"version": "7.2.96",
"resolved": "https://registry.npmjs.org/@mdi/font/-/font-7.2.96.tgz",
"integrity": "sha512-e//lmkmpFUMZKhmCY9zdjRe4zNXfbOIJnn6xveHbaV2kSw5aJ5dLXUxcRt1Gxfi7ZYpFLUWlkG2MGSFAiqAu7w=="
},
"node_modules/@mrmlnc/readdir-enhanced": {
"version": "2.2.1",
@@ -3844,14 +3844,24 @@
}
},
"node_modules/caniuse-lite": {
"version": "1.0.30001317",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001317.tgz",
"integrity": "sha512-xIZLh8gBm4dqNX0gkzrBeyI86J2eCjWzYAs40q88smG844YIrN4tVQl/RhquHvKEKImWWFIVh1Lxe5n1G/N+GQ==",
"version": "1.0.30001476",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001476.tgz",
"integrity": "sha512-JmpktFppVSvyUN4gsLS0bShY2L9ZUslHLE72vgemBkS43JD2fOvKTKs+GtRwuxrtRGnwJFW0ye7kWRRlLJS9vQ==",
"dev": true,
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/browserslist"
}
"funding": [
{
"type": "opencollective",
"url": "https://opencollective.com/browserslist"
},
{
"type": "tidelift",
"url": "https://tidelift.com/funding/github/npm/caniuse-lite"
},
{
"type": "github",
"url": "https://github.com/sponsors/ai"
}
]
},
"node_modules/case-sensitive-paths-webpack-plugin": {
"version": "2.4.0",
@@ -16432,9 +16442,9 @@
}
},
"@mdi/font": {
"version": "5.9.55",
"resolved": "https://registry.npmjs.org/@mdi/font/-/font-5.9.55.tgz",
"integrity": "sha512-jswRF6q3eq8NWpWiqct6q+6Fg/I7nUhrxYJfiEM8JJpap0wVJLQdbKtyS65GdlK7S7Ytnx3TTi/bmw+tBhkGmg=="
"version": "7.2.96",
"resolved": "https://registry.npmjs.org/@mdi/font/-/font-7.2.96.tgz",
"integrity": "sha512-e//lmkmpFUMZKhmCY9zdjRe4zNXfbOIJnn6xveHbaV2kSw5aJ5dLXUxcRt1Gxfi7ZYpFLUWlkG2MGSFAiqAu7w=="
},
"@mrmlnc/readdir-enhanced": {
"version": "2.2.1",
@@ -18175,9 +18185,9 @@
}
},
"caniuse-lite": {
"version": "1.0.30001317",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001317.tgz",
"integrity": "sha512-xIZLh8gBm4dqNX0gkzrBeyI86J2eCjWzYAs40q88smG844YIrN4tVQl/RhquHvKEKImWWFIVh1Lxe5n1G/N+GQ==",
"version": "1.0.30001476",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001476.tgz",
"integrity": "sha512-JmpktFppVSvyUN4gsLS0bShY2L9ZUslHLE72vgemBkS43JD2fOvKTKs+GtRwuxrtRGnwJFW0ye7kWRRlLJS9vQ==",
"dev": true
},
"case-sensitive-paths-webpack-plugin": {

View File

@@ -3,11 +3,11 @@
"version": "0.0.0",
"private": true,
"scripts": {
"serve": "vue-cli-service serve",
"serve": "vue-cli-service serve --host=0.0.0.0",
"build": "vue-cli-service build"
},
"dependencies": {
"@mdi/font": "^5.6.55",
"@mdi/font": "^7.2.96",
"core-js": "^3.6.5",
"d3": "^7.0.1",
"jwt-decode": "^3.0.0",

View File

@@ -44,7 +44,7 @@
<template v-slot:activator="{ on, attrs }">
<v-text-field
v-model="tsDate"
:disabled="!!(sequence || point || entrySequence || entryPoint)"
:disabled="!!(entrySequence || entryPoint)"
label="Date"
suffix="UTC"
prepend-icon="mdi-calendar"
@@ -64,7 +64,7 @@
<v-col>
<v-text-field
v-model="tsTime"
:disabled="!!(sequence || point || entrySequence || entryPoint)"
:disabled="!!(entrySequence || entryPoint)"
label="Time"
suffix="UTC"
prepend-icon="mdi-clock-outline"
@@ -256,6 +256,15 @@
>
Cancel
</v-btn>
<v-btn v-if="!id && (entrySequence || entryPoint)"
color="info"
text
title="Enter an event by time"
@click="timed"
>
<v-icon left small>mdi-clock-outline</v-icon>
Timed
</v-btn>
<v-spacer></v-spacer>
<v-btn
:disabled="!canSave"
@@ -632,6 +641,14 @@ export default {
}
},
timed () {
const tstamp = (new Date()).toISOString();
this.entrySequence = null;
this.entryPoint = null;
this.tsDate = tstamp.substr(0, 10);
this.tsTime = tstamp.substr(11, 8);
},
close () {
this.entryLabels = this.selectedLabels.map(this.labelToItem)
this.$emit("input", false);

View File

@@ -2,8 +2,8 @@
<div class="line-status" v-if="sequences.length == 0">
<slot name="empty"></slot>
</div>
<div class="line-status" v-else-if="sequenceHref">
<router-link v-for="sequence in sequences" :key="sequence.sequence"
<div class="line-status" v-else-if="sequenceHref || plannedSequenceHref || pendingReshootHref">
<router-link v-for="sequence in sequences" :key="sequence.sequence" v-if="sequenceHref"
class="sequence"
:class="sequence.status"
:style="style(sequence)"
@@ -11,15 +11,41 @@
:to="sequenceHref(sequence)"
>
</router-link>
<router-link v-for="sequence in plannedSequences" :key="sequence.sequence" v-if="plannedSequenceHref"
class="sequence planned"
:style="style(sequence)"
:title="title(sequence, 'planned')"
:to="plannedSequenceHref(sequence)"
>
</router-link>
<router-link v-for="(line, key) in pendingReshoots" :key="key" v-if="pendingReshootHref"
class="sequence reshoot"
:style="style(line)"
:title="title(line, 'reshoot')"
:to="pendingReshootHref(line)"
>
</router-link>
</div>
<div class="line-status" v-else>
<div v-for="sequence in sequences"
<div v-for="sequence in sequences" :key="sequence.sequence"
class="sequence"
:class="sequence.status"
:style="style(sequence)"
:title="title(sequence)"
>
</div>
<div v-for="sequence in plannedSequences" :key="sequence.sequence"
class="sequence planned"
:style="style(sequence)"
:title="title(sequence, 'planned')"
>
</div>
<div v-for="(line, key) in pendingReshoots" :key="key"
class="sequence reshoot"
:style="style(line)"
:title="title(line, 'reshoot')"
>
</div>
</div>
</template>
@@ -48,6 +74,8 @@
background-color blue
&.planned
background-color magenta
&.reshoot
background repeating-linear-gradient(-45deg, rgba(255,0,255,0.302), brown 5px, rgba(247, 247, 247, 0.1) 5px, rgba(242, 241, 241, 0.08) 10px), repeating-linear-gradient(45deg, rgba(255,0,255,0.302), brown 5px, rgba(247, 247, 247, 0.1) 5px, rgba(242, 241, 241, 0.08) 10px)
</style>
<script>
@@ -58,7 +86,11 @@ export default {
props: {
preplot: Object,
sequences: Array,
"sequence-href": Function
"sequence-href": Function,
"planned-sequences": Array,
"planned-sequence-href": Function,
"pending-reshoots": Array,
"pending-reshoot-href": Function
},
methods: {
@@ -68,13 +100,13 @@ export default {
? s.fsp_final
: s.status == "ntbp"
? (s.fsp_final || s.fsp)
: s.fsp; /* status == "raw" */
: s.fsp; /* status == "raw" or planned sequence or pending reshoot */
const lsp = s.status == "final"
? s.lsp_final
: s.status == "ntbp"
? (s.lsp_final || s.lsp)
: s.lsp; /* status == "raw" */
: s.lsp; /* status == "raw" or planned sequence or pending reshoot */
const pp0 = Math.min(this.preplot.fsp, this.preplot.lsp);
const pp1 = Math.max(this.preplot.fsp, this.preplot.lsp);
@@ -91,20 +123,24 @@ export default {
return values;
},
title (s) {
const status = s.status == "final"
? "Final"
: s.status == "raw"
? "Acquired"
: s.status == "ntbp"
? "NTBP"
: s.status == "planned"
? "Planned"
: s.status;
title (s, type) {
if (s.status || type == "planned") {
const status = s.status == "final"
? "Final"
: s.status == "raw"
? "Acquired"
: s.status == "ntbp"
? "NTBP"
: type == "planned"
? "Planned"
: s.status;
const remarks = "\n"+[s.remarks, s.remarks_final].join("\n").trim()
return `Sequence ${s.sequence} ${status} (${s.fsp_final || s.fsp}–${s.lsp_final || s.lsp})${remarks}`;
} else if (type == "reshoot") {
return `Pending reshoot (${s.fsp}–${s.lsp})${s.remarks? "\n"+s.remarks : ""}`;
}
}
}

View File

@@ -5,6 +5,11 @@ import api from './modules/api'
import user from './modules/user'
import snack from './modules/snack'
import project from './modules/project'
import event from './modules/event'
import label from './modules/label'
import sequence from './modules/sequence'
import plan from './modules/plan'
import line from './modules/line'
import notify from './modules/notify'
Vue.use(Vuex)
@@ -15,6 +20,11 @@ export default new Vuex.Store({
user,
snack,
project,
event,
label,
sequence,
plan,
line,
notify
}
})

View File

@@ -16,7 +16,7 @@ async function api ({state, commit, dispatch}, [resource, init = {}, cb]) {
const url = /^https?:\/\//i.test(resource) ? resource : (state.apiUrl + resource);
const res = await fetch(url, init);
if (typeof cb === 'function') {
cb(null, res);
await cb(null, res);
}
if (res.ok) {
@@ -35,7 +35,14 @@ async function api ({state, commit, dispatch}, [resource, init = {}, cb]) {
throw err;
}
} else {
await dispatch('showSnack', [res.statusText, "warning"]);
let message = res.statusText;
if ((res.headers.get("Content-Type") || "").match(/^application\/json/i)) {
const body = await res.json();
if (body.message) {
message = body.message;
}
}
await dispatch('showSnack', [message, "warning"]);
}
} catch (err) {
if (err && err.name == "AbortError") return;
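For reference, a hedged sketch of invoking this action from a component (resource path, callback and variable names are illustrative):
// Callers pass [resource, init, cb]; non-OK JSON responses surface their
// `message` via the snack bar, per the handler above.
const events = await this.$store.dispatch('api', [
  `/project/${pid}/event`,
  { method: 'GET' },
  async (err, res) => { if (res) console.debug('HTTP', res.status); }
]);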

View File

@@ -0,0 +1,129 @@
/** Fetch events from server
*/
async function refreshEvents ({commit, dispatch, state, rootState}, [modifiedAfter] = []) {
if (!modifiedAfter) {
modifiedAfter = state.timestamp;
}
if (state.loading) {
commit('abortEventsLoading');
}
commit('setEventsLoading');
const pid = rootState.project.projectId;
const url = modifiedAfter
? `/project/${pid}/event/changes/${(new Date(modifiedAfter)).toISOString()}?unique=t`
: `/project/${pid}/event`;
const init = {
signal: state.loading.signal
};
const res = await dispatch('api', [url, init]);
if (res) {
if (modifiedAfter) {
commit('setModifiedEvents', res);
} else {
commit('setEvents', res);
}
commit('setEventsTimestamp');
}
commit('clearEventsLoading');
}
/** Return a subset of events from state.events
*/
async function getEvents ({commit, dispatch, state}, [projectId, {sequence, date0, date1, sortBy, sortDesc, itemsPerPage, page, text, label}]) {
let filteredEvents = [...state.events];
if (sortBy) {
sortBy.forEach( (key, idx) => {
filteredEvents.sort( (el0, el1) => {
const a = el0?.[key];
const b = el1?.[key];
if (a < b) {
return -1;
} else if (a > b) {
return 1;
} else if (a == b) {
return 0;
} else if (a && !b) {
return 1;
} else if (!a && b) {
return -1;
} else {
return 0;
}
});
if (sortDesc && sortDesc[idx] === true) {
filteredEvents.reverse();
}
});
}
if (sequence) {
filteredEvents = filteredEvents.filter( event => event.sequence == sequence );
}
if (date0 && date1) {
filteredEvents = filteredEvents.filter( event =>
event.tstamp.substr(0, 10) >= date0 && event.tstamp.substr(0, 10) <= date1
);
} else if (date0) {
filteredEvents = filteredEvents.filter( event => event.tstamp.substr(0, 10) == date0 );
}
if (text) {
const tstampFilter = (value, search, item) => {
return textFilter(value, search, item);
};
const numberFilter = (value, search, item) => {
return value == search;
};
const textFilter = (value, search, item) => {
return String(value).toLowerCase().includes(search.toLowerCase());
};
const searchFunctions = {
tstamp: tstampFilter,
sequence: numberFilter,
point: numberFilter,
remarks: textFilter,
labels: (value, search, item) => value.some(label => textFilter(label, search, item))
};
filteredEvents = filteredEvents.filter ( event => {
for (let key in searchFunctions) {
const fn = searchFunctions[key];
if (fn(event[key], text, event)) {
return true;
}
}
return false;
});
}
if (label) {
filteredEvents = filteredEvents.filter( event => event.labels?.includes(label) );
}
const count = filteredEvents.length;
if (itemsPerPage && itemsPerPage > 0) {
const offset = (page > 0)
? (page-1) * itemsPerPage
: 0;
filteredEvents = filteredEvents.slice(offset, offset+itemsPerPage);
}
return {events: filteredEvents, count};
}
export default { refreshEvents, getEvents };
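An illustrative dispatch of getEvents (option names from the signature above, values invented):
// Hedged usage sketch: sort, filter and paginate the cached events.
const { events, count } = await this.$store.dispatch('getEvents', [pid, {
  sequence: 1042,     // only events for this sequence
  sortBy: ['tstamp'],
  sortDesc: [true],
  text: 'FSP',        // free text matched against tstamp, sequence, point, remarks, labels
  itemsPerPage: 25,
  page: 1
}]);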

View File

@@ -0,0 +1,14 @@
function events (state) {
return state.events;
}
function eventCount (state) {
return state.events?.length ?? 0;
}
function eventsLoading (state) {
return !!state.loading;
}
export default { events, eventCount, eventsLoading };

View File

@@ -0,0 +1,6 @@
import state from './state'
import getters from './getters'
import actions from './actions'
import mutations from './mutations'
export default { state, getters, actions, mutations };

View File

@@ -0,0 +1,73 @@
function setEvents (state, events) {
// We don't need or want the events array to be reactive, since
// it can be tens of thousands of items long.
state.events = Object.freeze(events);
}
/** Selectively replace / insert / delete events
* from state.events.
*
* modifiedEvents is the result of
* /api/project/:project/event/changes?unique=t
*/
function setModifiedEvents (state, modifiedEvents) {
const events = [...state.events];
for (let evt of modifiedEvents) {
const idx = events.findIndex(i => i.id == evt.id);
if (idx != -1) {
if (evt.is_deleted) {
events.splice(idx, 1);
} else {
delete evt.is_deleted;
events.splice(idx, 1, evt);
}
} else {
if (!evt.is_deleted) {
delete evt.is_deleted;
events.unshift(evt);
}
}
}
setEvents(state, events);
}
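// Illustration (shapes assumed from the handler above): given
//   state.events = [{id: 1}, {id: 2}]
// a changes payload of
//   [{id: 1, remarks: "edited"}, {id: 2, is_deleted: true}, {id: 3}]
// replaces event 1 in place, removes event 2 and prepends event 3:
//   state.events = [{id: 3}, {id: 1, remarks: "edited"}]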
function setEventsLoading (state, abortController = new AbortController()) {
state.loading = abortController;
}
function clearEventsLoading (state) {
state.loading = null;
}
function setEventsTimestamp (state, timestamp = new Date()) {
if (timestamp === true) {
const tstamp = state.events
.map( event => event.modified_on )
.reduce( (acc, cur) => acc > cur ? acc : cur, null );
state.timestamp = tstamp ? new Date(tstamp) : new Date();
} else {
state.timestamp = timestamp;
}
}
function setEventsETag (state, etag) {
state.etag = etag;
}
function abortEventsLoading (state) {
if (state.loading) {
state.loading.abort();
}
state.loading = null;
}
export default {
setEvents,
setModifiedEvents,
setEventsLoading,
clearEventsLoading,
abortEventsLoading,
setEventsTimestamp,
setEventsETag
};

View File

@@ -0,0 +1,8 @@
const state = () => ({
events: Object.freeze([]),
loading: null,
timestamp: null,
etag: null,
});
export default state;

View File

@@ -0,0 +1,106 @@
/** Fetch labels from server
*/
async function refreshLabels ({commit, dispatch, state, rootState}) {
if (state.loading) {
commit('abortLabelsLoading');
}
commit('setLabelsLoading');
const pid = rootState.project.projectId;
const url = `/project/${pid}/label`;
const init = {
signal: state.loading.signal
};
const res = await dispatch('api', [url, init]);
if (res) {
commit('setLabels', res);
commit('setLabelsTimestamp');
}
commit('clearLabelsLoading');
}
/** Return a subset of labels from state.labels.
*
* Note that, unlike other actions in the get* family,
* the return value is not isomorphic to the state.
*
* While state.labels is an object, getLabels() returns
* an array with each item have the shape:
*
* { label: "labelName", view: {…}, model: {…} }
*
* This is intended to be useful, for instance, for a table
* of labels.
*/
async function getLabels ({commit, dispatch, state}, [projectId, {sortBy, sortDesc, itemsPerPage, page, text, label}]) {
let filteredLabels = Object.entries(state.labels).map(i => {
return {
label: i[0],
...i[1]
}
});
if (sortBy) {
sortBy.forEach( (key, idx) => {
filteredLabels.sort( (el0, el1) => {
const a = key == "label" ? el0.label : el0.view?.[key];
const b = key == "label" ? el1.label : el1.view?.[key];
if (a < b) {
return -1;
} else if (a > b) {
return 1;
} else if (a == b) {
return 0;
} else if (a && !b) {
return 1;
} else if (!a && b) {
return -1;
} else {
return 0;
}
});
if (sortDesc && sortDesc[idx] === true) {
filteredLabels.reverse();
}
});
}
if (label) {
filteredLabels = filteredLabels.filter( item => item.label == label );
}
if (text) {
const textFilter = (value, search, item) => {
return String(value).toLowerCase().includes(search.toLowerCase());
};
filteredLabels = filteredLabels.filter ( item => {
return textFilter(item.label, text, item) || textFilter(item.view.description, text, item);
});
}
const count = filteredLabels.length;
if (itemsPerPage && itemsPerPage > 0) {
const offset = (page > 0)
? (page-1) * itemsPerPage
: 0;
filteredLabels = filteredLabels.slice(offset, offset+itemsPerPage);
}
return {labels: filteredLabels, count};
}
export default { refreshLabels, getLabels };
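A sketch of the object-to-array reshaping described above (label names and fields invented):
// Hedged illustration of the transform performed by getLabels.
const stateLabels = {
  FSP: { view: { description: "First shot point" }, model: { user: false } },
  QC: { view: { description: "QC remark" }, model: { user: true } }
};
const rows = Object.entries(stateLabels).map(([label, rest]) => ({ label, ...rest }));
// → [ { label: "FSP", view: {…}, model: {…} }, { label: "QC", … } ]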

View File

@@ -0,0 +1,22 @@
function labels (state) {
return state.labels;
}
/** Return labels that can be added by users.
*
* As opposed to system labels.
*/
function userLabels (state) {
return Object.fromEntries(Object.entries(state.labels).filter(i => i[1].model.user));
}
function labelCount (state) {
return state.labels?.length ?? 0;
}
function labelsLoading (state) {
return !!state.loading;
}
export default { labels, userLabels, labelCount, labelsLoading };

View File

@@ -0,0 +1,6 @@
import state from './state'
import getters from './getters'
import actions from './actions'
import mutations from './mutations'
export default { state, getters, actions, mutations };

View File

@@ -0,0 +1,49 @@
function setLabels (state, labels) {
// We don't need or want the labels object to be reactive, since
// it can be large.
state.labels = Object.freeze(labels);
}
function setLabelsLoading (state, abortController = new AbortController()) {
state.loading = abortController;
}
// This assumes that we know any transactions have finished or we
// don't care about aborting.
function clearLabelsLoading (state) {
state.loading = null;
}
function setLabelsTimestamp (state, timestamp = new Date()) {
// NOTE: There is no `modified_on` property in the labels
// result or in the database schema, but we could add
// one.
if (timestamp === true) {
const tstamp = Object.values(state.labels)
.map( i => i.modified_on )
.reduce( (acc, cur) => acc > cur ? acc : cur, null );
state.timestamp = tstamp ? new Date(tstamp) : new Date();
} else {
state.timestamp = timestamp;
}
}
function setLabelsETag (state, etag) {
state.etag = etag;
}
function abortLabelsLoading (state) {
if (state.loading) {
state.loading.abort();
}
state.loading = null;
}
export default {
setLabels,
setLabelsLoading,
clearLabelsLoading,
abortLabelsLoading,
setLabelsTimestamp,
setLabelsETag
};

View File

@@ -0,0 +1,8 @@
const state = () => ({
labels: Object.freeze([]),
loading: null,
timestamp: null,
etag: null,
});
export default state;

View File

@@ -0,0 +1,117 @@
/** Fetch lines from server
*/
async function refreshLines ({commit, dispatch, state, rootState}) {
if (state.loading) {
commit('abortLinesLoading');
}
commit('setLinesLoading');
const pid = rootState.project.projectId;
const url = `/project/${pid}/line`;
const init = {
signal: state.loading.signal
};
const res = await dispatch('api', [url, init]);
if (res) {
commit('setLines', res);
commit('setLinesTimestamp');
}
commit('clearLinesLoading');
}
/** Return a subset of lines from state.lines
*/
async function getLines ({commit, dispatch, state}, [projectId, {line, fsp, lsp, incr, sortBy, sortDesc, itemsPerPage, page, text}]) {
let filteredLines = [...state.lines];
if (sortBy) {
sortBy.forEach( (key, idx) => {
filteredLines.sort( (el0, el1) => {
const a = el0?.[key];
const b = el1?.[key];
if (a < b) {
return -1;
} else if (a > b) {
return 1;
} else if (a == b) {
return 0;
} else if (a && !b) {
return 1;
} else if (!a && b) {
return -1;
} else {
return 0;
}
});
if (sortDesc && sortDesc[idx] === true) {
filteredLines.reverse();
}
});
}
if (line) {
filteredLines = filteredLines.filter( item => item.line == line );
}
if (fsp) {
filteredLines = filteredLines.filter( line => line.fsp == fsp );
}
if (lsp) {
filteredLines = filteredLines.filter( line => line.lsp == lsp );
}
if (text) {
const numberFilter = (value, search, item) => {
return value == search;
};
const textFilter = (value, search, item) => {
return String(value).toLowerCase().includes(search.toLowerCase());
};
const incrFilter = (value, search, item) => {
const inc = /^(incr(ement)?|↑|\+)/i;
const dec = /^(decr(ement)?|↓|-)/i;
return (inc.test(search) && value) || (dec.test(search) && !value)
}
const searchFunctions = {
line: numberFilter,
fsp: numberFilter,
lsp: numberFilter,
remarks: textFilter,
incr: incrFilter,
ntba: (value, search, item) => text.toLowerCase() == "ntba" && value
};
filteredLines = filteredLines.filter ( line => {
for (let key in searchFunctions) {
const fn = searchFunctions[key];
if (fn(line[key], text, line)) {
return true;
}
}
return false;
});
}
const count = filteredLines.length;
if (itemsPerPage && itemsPerPage > 0) {
const offset = (page > 0)
? (page-1) * itemsPerPage
: 0;
filteredLines = filteredLines.slice(offset, offset+itemsPerPage);
}
return {lines: filteredLines, count};
}
export default { refreshLines, getLines };
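An illustrative dispatch of getLines (values invented); the free-text filter also understands direction tokens via incrFilter and the literal "ntba":
// Hedged usage sketch mirroring getEvents.
const { lines, count } = await this.$store.dispatch('getLines', [pid, {
  text: 'incr',    // "incr", "↑" or "+" match increasing lines; "decr", "↓" or "-" decreasing
  sortBy: ['line'],
  itemsPerPage: 50,
  page: 1
}]);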

View File

@@ -0,0 +1,14 @@
function lines (state) {
return state.lines;
}
function lineCount (state) {
return state.lines?.length ?? 0;
}
function linesLoading (state) {
return !!state.loading;
}
export default { lines, lineCount, linesLoading };

View File

@@ -0,0 +1,6 @@
import state from './state'
import getters from './getters'
import actions from './actions'
import mutations from './mutations'
export default { state, getters, actions, mutations };

View File

@@ -0,0 +1,49 @@
function setLines (state, lines) {
// We don't need or want the lines array to be reactive, since
// it can be thousands of items long.
state.lines = Object.freeze(lines);
}
function setLinesLoading (state, abortController = new AbortController()) {
state.loading = abortController;
}
// This assumes that we know any transactions have finished or we
// don't care about aborting.
function clearLinesLoading (state) {
state.loading = null;
}
function setLinesTimestamp (state, timestamp = new Date()) {
// NOTE: There is no `modified_on` property in the lines
// result or in the database schema, but we could perhaps add
// one.
if (timestamp === true) {
const tstamp = state.lines
.map( line => line.modified_on )
.reduce( (acc, cur) => acc > cur ? acc : cur, null );
state.timestamp = tstamp ? new Date(tstamp) : new Date();
} else {
state.timestamp = timestamp;
}
}
function setLinesETag (state, etag) {
state.etag = etag;
}
function abortLinesLoading (state) {
if (state.loading) {
state.loading.abort();
}
state.loading = null;
}
export default {
setLines,
setLinesLoading,
clearLinesLoading,
abortLinesLoading,
setLinesTimestamp,
setLinesETag
};

View File

@@ -0,0 +1,8 @@
const state = () => ({
lines: Object.freeze([]),
loading: null,
timestamp: null,
etag: null,
});
export default state;

View File

@@ -0,0 +1,114 @@
/** Fetch sequences from server
*/
async function refreshPlan ({commit, dispatch, state, rootState}) {
if (state.loading) {
commit('abortPlanLoading');
}
commit('setPlanLoading');
const pid = rootState.project.projectId;
const url = `/project/${pid}/plan`;
const init = {
signal: state.loading.signal
};
const res = await dispatch('api', [url, init]);
if (res) {
commit('setPlan', res);
commit('setPlanTimestamp');
}
commit('clearPlanLoading');
}
/** Return a subset of sequences from state.sequences
*/
async function getPlannedSequences ({commit, dispatch, state}, [projectId, {sequence, date0, date1, sortBy, sortDesc, itemsPerPage, page, text}]) {
let filteredPlannedSequences = [...state.sequences];
if (sortBy) {
sortBy.forEach( (key, idx) => {
filteredPlannedSequences.sort( (el0, el1) => {
const a = el0?.[key];
const b = el1?.[key];
if (a < b) {
return -1;
} else if (a > b) {
return 1;
} else if (a == b) {
return 0;
} else if (a && !b) {
return 1;
} else if (!a && b) {
return -1;
} else {
return 0;
}
});
if (sortDesc && sortDesc[idx] === true) {
filteredPlannedSequences.reverse();
}
});
}
if (sequence) {
filteredPlannedSequences = filteredPlannedSequences.filter( s => s.sequence == sequence );
}
if (date0 && date1) {
filteredPlannedSequences = filteredPlannedSequences.filter( s =>
s.ts0.toISOString().substr(0, 10) >= date0 && s.ts1.toISOString().substr(0, 10) <= date1
);
} else if (date0) {
filteredPlannedSequences = filteredPlannedSequences.filter( s => s.ts0.toISOString().substr(0, 10) == date0 || s.ts1.toISOString().substr(0, 10) == date0 );
}
if (text) {
const tstampFilter = (value, search, item) => {
return textFilter(value.toISOString(), search, item);
};
const numberFilter = (value, search, item) => {
return value == search;
};
const textFilter = (value, search, item) => {
return String(value).toLowerCase().includes(search.toLowerCase());
};
const searchFunctions = {
sequence: numberFilter,
line: numberFilter,
remarks: textFilter,
ts0: tstampFilter,
ts1: tstampFilter
};
filteredPlannedSequences = filteredPlannedSequences.filter ( sequence => {
for (let key in searchFunctions) {
const fn = searchFunctions[key];
if (fn(sequence[key], text, sequence)) {
return true;
}
}
return false;
});
}
const count = filteredPlannedSequences.length;
if (itemsPerPage && itemsPerPage > 0) {
const offset = (page > 0)
? (page-1) * itemsPerPage
: 0;
filteredPlannedSequences = filteredPlannedSequences.slice(offset, offset+itemsPerPage);
}
return {sequences: filteredPlannedSequences, count};
}
export default { refreshPlan, getPlannedSequences };
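refreshPlan follows a last-write-wins pattern: a new call aborts any in-flight request through the AbortController kept in state.loading. The same pattern in isolation, as a sketch (the api action's internals are not shown in this diff, so its abort handling is assumed):

let current = null;

async function refresh (url) {
  if (current) current.abort();          // cancel the superseded request
  const controller = new AbortController();
  current = controller;
  try {
    const res = await fetch(url, { signal: controller.signal });
    return await res.json();
  } catch (err) {
    if (err.name === 'AbortError') return null; // superseded, not a failure
    throw err;
  } finally {
    if (current === controller) current = null; // only clear our own handle
  }
}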

View File

@@ -0,0 +1,18 @@
function planRemarks (state) {
return state.remarks;
}
function plannedSequences (state) {
return state.sequences;
}
function plannedSequenceCount (state) {
return state.sequences?.length ?? 0;
}
function plannedSequencesLoading (state) {
return !!state.loading;
}
export default { planRemarks, plannedSequences, plannedSequenceCount, plannedSequencesLoading };

View File

@@ -0,0 +1,6 @@
import state from './state'
import getters from './getters'
import actions from './actions'
import mutations from './mutations'
export default { state, getters, actions, mutations };

View File

@@ -0,0 +1,59 @@
function transform (item) {
item.ts0 = new Date(item.ts0);
item.ts1 = new Date(item.ts1);
return item;
}
// ATTENTION: This relies on the new planner endpoint
// as per issue #281.
function setPlan (state, plan) {
// We don't need or want the planned sequences array to be reactive
state.sequences = Object.freeze(plan.sequences.map(transform));
state.remarks = plan.remarks;
}
function setPlanLoading (state, abortController = new AbortController()) {
state.loading = abortController;
}
// This assumes that we know any transactions have finished or we
// don't care about aborting.
function clearPlanLoading (state) {
state.loading = null;
}
function setPlanTimestamp (state, timestamp = new Date()) {
// NOTE: There is no `modified_on` property in the plan
// result or in the database schema, but we should probably add
// one.
if (timestamp === true) {
const tstamp = state.sequences
.map( item => item.modified_on )
.reduce( (acc, cur) => acc > cur ? acc : cur, null );
state.timestamp = tstamp ? new Date(tstamp) : new Date();
} else {
state.timestamp = timestamp;
}
}
function setPlanETag (state, etag) {
state.etag = etag;
}
function abortPlanLoading (state) {
if (state.loading) {
state.loading.abort();
}
state.loading = null;
}
export default {
setPlan,
setPlanLoading,
clearPlanLoading,
setPlanTimestamp,
setPlanETag,
abortPlanLoading
};

View File

@@ -0,0 +1,9 @@
const state = () => ({
sequences: Object.freeze([]),
remarks: null,
loading: null,
timestamp: null,
etag: null,
});
export default state;

View File

@@ -1,13 +1,19 @@
async function getProject ({commit, dispatch}, projectId) {
const res = await dispatch('api', [`/project/${String(projectId).toLowerCase()}`]);
const res = await dispatch('api', [`/project/${String(projectId).toLowerCase()}/configuration`]);
if (res) {
commit('setProjectName', res.name);
commit('setProjectId', res.pid);
commit('setProjectId', res.id?.toLowerCase());
commit('setProjectSchema', res.schema);
commit('setProjectConfiguration', res);
const recentProjects = JSON.parse(localStorage.getItem("recentProjects") || "[]")
recentProjects.unshift(res);
localStorage.setItem("recentProjects", JSON.stringify(recentProjects.slice(0, 3)));
} else {
commit('setProjectName', null);
commit('setProjectId', null);
commit('setProjectSchema', null);
commit('setProjectConfiguration', {});
}
}

View File

@@ -0,0 +1,18 @@
function projectId (state) {
return state.projectId;
}
function projectName (state) {
return state.projectName;
}
function projectSchema (state) {
return state.projectSchema;
}
function projectConfiguration (state) {
return state.projectConfiguration;
}
export default { projectId, projectName, projectSchema, projectConfiguration };

View File

@@ -11,4 +11,8 @@ function setProjectSchema (state, schema) {
state.projectSchema = schema;
}
export default { setProjectId, setProjectName, setProjectSchema };
function setProjectConfiguration (state, configuration) {
state.projectConfiguration = Object.freeze(configuration);
}
export default { setProjectId, setProjectName, setProjectSchema, setProjectConfiguration };

View File

@@ -1,7 +1,8 @@
const state = () => ({
projectId: null,
projectName: null,
projectSchema: null
projectSchema: null,
projectConfiguration: {}
});
export default state;

View File

@@ -0,0 +1,122 @@
/** Fetch sequences from server
*/
async function refreshSequences ({commit, dispatch, state, rootState}) {
if (state.loading) {
commit('abortSequencesLoading');
}
commit('setSequencesLoading');
const pid = rootState.project.projectId;
const url = `/project/${pid}/sequence?files=true`;
const init = {
signal: state.loading.signal
};
const res = await dispatch('api', [url, init]);
if (res) {
commit('setSequences', res);
commit('setSequencesTimestamp');
}
commit('clearSequencesLoading');
}
/** Return a subset of sequences from state.sequences
*/
async function getSequences ({commit, dispatch, state}, [projectId, {sequence, date0, date1, sortBy, sortDesc, itemsPerPage, page, text}]) {
let filteredSequences = [...state.sequences];
if (sortBy) {
sortBy.forEach( (key, idx) => {
filteredSequences.sort( (el0, el1) => {
const a = el0?.[key];
const b = el1?.[key];
if (a < b) {
return -1;
} else if (a > b) {
return 1;
} else if (a == b) {
return 0;
} else if (a && !b) {
return 1;
} else if (!a && b) {
return -1;
} else {
return 0;
}
});
if (sortDesc && sortDesc[idx] === true) {
filteredSequences.reverse();
}
});
}
if (sequence) {
filteredSequences = filteredSequences.filter( s => s.sequence == sequence );
}
if (date0 && date1) {
filteredSequences = filteredSequences.filter( sequence =>
(sequence.ts0_final ?? sequence.ts0)?.substr(0, 10) >= date0 &&
(sequence.ts1_final ?? sequence.ts1)?.substr(0, 10) <= date1
);
} else if (date0) {
filteredSequences = filteredSequences.filter( s => (s.ts0_final ?? s.ts0)?.substr(0, 10) == date0 || (s.ts1_final ?? s.ts1)?.substr(0, 10) == date0 );
}
if (text) {
const tstampFilter = (value, search, item) => {
return search?.length >= 5 && textFilter(value, search, item);
};
const numberFilter = (value, search, item) => {
return value == search;
};
const textFilter = (value, search, item) => {
return String(value).toLowerCase().includes(search.toLowerCase());
};
const searchFunctions = {
ts0: tstampFilter,
ts1: tstampFilter,
ts0_final: tstampFilter,
ts1_final: tstampFilter,
sequence: numberFilter,
line: numberFilter,
fsp: numberFilter,
lsp: numberFilter,
fsp_final: numberFilter,
lsp_final: numberFilter,
remarks: textFilter,
remarks_final: textFilter
};
filteredSequences = filteredSequences.filter ( sequence => {
for (let key in searchFunctions) {
const fn = searchFunctions[key];
if (fn(sequence[key], text, sequence)) {
return true;
}
}
return false;
});
}
const count = filteredSequences.length;
if (itemsPerPage && itemsPerPage > 0) {
const offset = (page > 0)
? (page-1) * itemsPerPage
: 0;
filteredSequences = filteredSequences.slice(offset, offset+itemsPerPage);
}
return {sequences: filteredSequences, count};
}
export default { refreshSequences, getSequences };
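All of these modules dispatch a shared api action with an [url, init] tuple. Its implementation is outside this diff; a plausible minimal version, consistent with how the callers treat the result (parsed JSON on success, falsy on failure), might look like:

// Root-store action sketch. Only the [url, init] calling convention and the
// "falsy on failure" behaviour are taken from the diff; the '/api' prefix
// and the error handling are assumptions.
async function api (context, [url, init = {}]) {
  try {
    const res = await fetch(`/api${url}`, { credentials: 'same-origin', ...init });
    return res.ok ? await res.json() : null;
  } catch (err) {
    if (err.name !== 'AbortError') console.error(err);
    return null;
  }
}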

View File

@@ -0,0 +1,14 @@
function sequences (state) {
return state.sequences;
}
function sequenceCount (state) {
return state.sequences?.length ?? 0;
}
function sequencesLoading (state) {
return !!state.loading;
}
export default { sequences, sequenceCount, sequencesLoading };

View File

@@ -0,0 +1,6 @@
import state from './state'
import getters from './getters'
import actions from './actions'
import mutations from './mutations'
export default { state, getters, actions, mutations };

View File

@@ -0,0 +1,49 @@
function setSequences (state, sequences) {
// We don't need or want the sequences array to be reactive, since
// it can be tens of thousands of items long.
state.sequences = Object.freeze(sequences);
}
function setSequencesLoading (state, abortController = new AbortController()) {
state.loading = abortController;
}
// This assumes that we know any transactions have finished or we
// don't care about aborting.
function clearSequencesLoading (state) {
state.loading = null;
}
function setSequencesTimestamp (state, timestamp = new Date()) {
// NOTE: There is no `modified_on` property in the sequences
// result or in the database schema, but we should probably add
// one.
if (timestamp === true) {
const tstamp = state.sequences
.map( sequence => sequence.modified_on )
.reduce( (acc, cur) => acc > cur ? acc : cur, null );
state.timestamp = tstamp ? new Date(tstamp) : new Date();
} else {
state.timestamp = timestamp;
}
}
function setSequencesETag (state, etag) {
state.etag = etag;
}
function abortSequencesLoading (state) {
if (state.loading) {
state.loading.abort();
}
state.loading = null;
}
export default {
setSequences,
setSequencesLoading,
clearSequencesLoading,
setSequencesTimestamp,
setSequencesETag,
abortSequencesLoading
};

View File

@@ -0,0 +1,8 @@
const state = () => ({
sequences: Object.freeze([]),
loading: null,
timestamp: null,
etag: null,
});
export default state;

View File

@@ -39,6 +39,12 @@
{{ $refs.calendar.title }}
</v-toolbar-title>
<v-spacer></v-spacer>
<v-btn v-if="categoriesAvailable"
small
class="mx-4"
v-model="useCategories"
@click="useCategories = !useCategories"
>Labels {{useCategories ? "On" : "Off"}}</v-btn>
<v-menu bottom right>
<template v-slot:activator="{ on, attrs }">
<v-btn
@@ -72,16 +78,23 @@
<v-calendar
ref="calendar"
v-model="focus"
:events="events"
:events="items"
:event-color="getEventColour"
color="primary"
:type="type"
:type="view"
:locale-first-day-of-year="4"
:weekdays="weekdays"
:show-week="true"
:category-days="categoryDays"
:categories="categories"
@click:date="showLogForDate"
@click:event="showLogForEvent"
></v-calendar>
@change="setSpan"
>
<template v-slot:event="{ event }">
<div style="height:100%;overflow:scroll;" v-html="event.name"></div>
</template>
</v-calendar>
</v-sheet>
</div>
</template>
@@ -97,8 +110,9 @@ export default {
weekdays: [1, 2, 3, 4, 5, 6, 0],
type: "week",
focus: "",
events: [
],
items: [],
useCategories: false,
span: {},
options: {
sortBy: "sequence"
}
@@ -117,28 +131,126 @@ export default {
return labels[this.type];
},
...mapGetters(['loading'])
view () {
return this.useCategories ? "category" : this.type;
},
categoriesAvailable () {
return this.type == "day" || this.type == "4day";
},
categoryDays () {
if (this.useCategories) {
const days = {
month: 30,
week: 7,
"4day": 4,
day: 1
};
return days[this.type];
}
},
visibleItems () {
return this.items.filter(i => {
const end = i.end ?? i.start;
if (i.start > this.span.end) {
return false;
}
if (end < this.span.start) {
return false;
}
return true;
});
},
categories () {
return [...new Set(this.visibleItems.map(i => i.category ?? "General"))];
},
...mapGetters(['sequencesLoading', 'sequences', 'events'])
},
watch: {
sequences () {
const isFirstLoad = !this.items.length;
this.getEvents();
if (isFirstLoad) {
this.setLast();
}
},
events () {
const isFirstLoad = !this.items.length;
this.getEvents();
if (isFirstLoad) {
this.setLast();
}
},
type () {
this.getEvents();
},
categoriesAvailable (value) {
if (!value) {
this.useCategories = false;
}
}
},
methods: {
async getEvents () {
const query = new URLSearchParams(this.options);
const url = `/project/${this.$route.params.project}/sequence?${query.toString()}`;
const finalSequences = await this.api([url]) || [];
this.events = finalSequences.map(s => {
const sequences = this.sequences.map(s => {
const e = {};
//e.start = s.ts0.substring(0,10)+" "+s.ts0.substring(11,19)
//e.end = s.ts1.substring(0,10)+" "+s.ts1.substring(11,19)
e.routerLink = { name: "logBySequence", params: { sequence: s.sequence } };
e.start = new Date(s.ts0);
e.end = new Date(s.ts1);
e.timed = true;
e.colour = "orange";
e.name = `Sequence ${s.sequence}`;
e.name = `<b>Sequence ${s.sequence}</b><br/>Line ${s.line}<br/><abbr title="Shotpoints">SP</abbr> ${s.fgsp ?? s.fsp}–${s.lgsp ?? s.lsp}`;
e.category = "Sequence"
return e;
});
const lineChanges = this.events.filter(i => i.meta?.["*ReportLineChangeTime*"]?.value && i.meta?.["*ReportLineChangeTime*"]?.type != "excess").map(i => {
const e = {};
const duration = i.meta?.["*ReportLineChangeTime*"]?.value;
e.end = new Date(i.tstamp);
e.start = new Date(e.end - duration);
e.timed = true;
e.colour = "pink";
e.name = "Line change";
e.category = "Production"
return e;
});
const excludedLabels = [ "FSP", "FGSP", "LSP", "LGSP", "QC" ];
const otherEvents = this.events.filter(i => !excludedLabels.some(l => i.labels.includes(l))).map(i => {
const e = {};
e.start = new Date(i.tstamp);
e.colour = "brown";
e.timed = true;
e.name = this.$options.filters.markdownInline(i.remarks);
e.category = i.labels[0];
return e;
});
this.items = [...sequences];
if (this.type == "day" || this.type == "4day") {
this.items.push(...lineChanges, ...otherEvents);
}
},
getEventColour (event) {
@@ -150,11 +262,15 @@ export default {
},
setFirst () {
this.focus = this.events[this.events.length-1].start;
if (this.items.length) {
this.focus = this.items[this.items.length-1].start;
}
},
setLast () {
this.focus = this.events[0].start;
if (this.items.length) {
this.focus = this.items[0].start;
}
},
prev () {
@@ -175,6 +291,13 @@ export default {
}
},
setSpan (span) {
this.span = {
start: new Date(span.start.date),
end: new Date((new Date(span.end.date)).valueOf() + 86400000)
};
},
...mapActions(["api"])
@@ -182,9 +305,7 @@ export default {
async mounted () {
await this.getEvents();
if (this.events.length) {
this.setLast();
}
this.setLast();
}
}

View File

@@ -11,6 +11,7 @@
label="Filter"
single-line
clearable
hint="Filter by line number, first or last shotpoint or remarks. Use incr or + / decr or - to show only incrementing / decrementing lines"
></v-text-field>
</v-toolbar>
</v-card-title>
@@ -106,12 +107,14 @@
<v-data-table
:headers="headers"
:items="items"
item-key="line"
:items-per-page.sync="itemsPerPage"
:server-items-length="lineCount"
item-key="line"
:search="filter"
:loading="loading"
:fixed-header="true"
:footer-props='{itemsPerPageOptions: [ 10, 25, 50, 100, 500, -1 ]}'
:loading="linesLoading"
:options.sync="options"
fixed-header
:footer-props='{itemsPerPageOptions: [ 10, 25, 50, 100, 500, -1 ], showFirstLastPage: true}'
:item-class="itemClass"
:show-select="selectOn"
v-model="selectedRows"
@@ -124,6 +127,10 @@
:preplot="item"
:sequences="sequences.filter(s => s.line == item.line)"
:sequence-href="(s) => `/projects/${$route.params.project}/log/sequence/${s.sequence}`"
:planned-sequences="plannedSequences.filter(s => s.line == item.line)"
:planned-sequence-href="() => `/projects/${$route.params.project}/plan`"
:pending-reshoots="null"
:pending-reshoot-href="null"
>
<template v-slot:empty>
<div v-if="!item.ntba" class="sequence" title="Virgin"></div>
@@ -161,7 +168,7 @@
icon
small
title="Edit"
:disabled="loading"
:disabled="linesLoading"
@click="editItem(item, 'remarks')"
>
<v-icon small>mdi-square-edit-outline</v-icon>
@@ -251,9 +258,10 @@ export default {
items: [],
selectOn: false,
selectedRows: [],
filter: null,
num_lines: null,
sequences: [],
filter: "",
options: {},
lineCount: null,
//sequences: [],
activeItem: null,
edit: null, // {line, key, value}
queuedReload: false,
@@ -273,11 +281,22 @@ export default {
},
computed: {
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
...mapGetters(['user', 'writeaccess', 'linesLoading', 'lines', 'sequences', 'plannedSequences'])
},
watch: {
options: {
handler () {
this.fetchLines();
},
deep: true
},
async lines () {
await this.fetchLines();
},
async edit (newVal, oldVal) {
if (newVal === null && oldVal !== null) {
const item = this.items.find(i => i.line == oldVal.line);
@@ -296,39 +315,9 @@ export default {
}
},
async serverEvent (event) {
if (event.payload.pid == this.$route.params.project) {
if (event.channel == "preplot_lines" || event.channel == "preplot_points") {
if (!this.loading && !this.queuedReload) {
// Do not force a non-cached response if refreshing as a result
// of an event notification. We will assume that the server has
// already had time to update the cache by the time our request
// gets back to it.
this.getLines();
} else {
this.queuedReload = true;
}
} else if ([ "planned_lines", "raw_lines", "final_lines" ].includes(event.channel)) {
if (!this.loading && !this.queuedReload) {
this.getSequences();
} else {
this.queuedReload = true;
}
}
}
},
queuedReload (newVal, oldVal) {
if (newVal && !oldVal && !this.loading) {
this.getLines();
this.getSequences();
}
},
loading (newVal, oldVal) {
if (!newVal && oldVal && this.queuedReload) {
this.getLines();
this.getSequences();
filter (newVal, oldVal) {
if (newVal?.toLowerCase() != oldVal?.toLowerCase()) {
this.fetchLines();
}
},
@@ -468,43 +457,28 @@ export default {
}
},
async getNumLines () {
const projectInfo = await this.api([`/project/${this.$route.params.project}`]);
this.num_lines = projectInfo.lines;
},
async getLines () {
const url = `/project/${this.$route.params.project}/line`;
this.queuedReload = false;
this.items = await this.api([url]) || [];
},
async getSequences () {
const urlS = `/project/${this.$route.params.project}/sequence`;
this.sequences = await this.api([urlS]) || [];
const urlP = `/project/${this.$route.params.project}/plan`;
const planned = await this.api([urlP]) || [];
planned.forEach(i => i.status = "planned");
this.sequences.push(...planned);
},
setActiveItem (item) {
this.activeItem = this.activeItem == item
? null
: item;
},
...mapActions(["api"])
async fetchLines (opts = {}) {
const options = {
text: this.filter,
...this.options
};
const res = await this.getLines([this.$route.params.project, options]);
this.items = res.lines;
this.lineCount = res.count;
},
...mapActions(["api", "getLines"])
},
mounted () {
this.getLines();
this.getNumLines();
this.getSequences();
this.fetchLines();
// Initialise stylesheet
const el = document.createElement("style");

View File

@@ -72,6 +72,10 @@
:href="`/api/project/${$route.params.project}/event/-/${$route.params.sequence}?mime=application%2Fjson`"
title="Download as a generic JSON file"
>JSON</v-list-item>
<v-list-item
:href="`/api/project/${$route.params.project}/event/-/${$route.params.sequence}?mime=text%2Fcsv`"
title="Download as Comma Separated Values file"
>CSV</v-list-item>
<v-list-item
:href="`/api/project/${$route.params.project}/event/-/${$route.params.sequence}?mime=text%2Fhtml`"
title="Download as an HTML formatted file"
@@ -89,7 +93,21 @@
append-icon="mdi-magnify"
label="Filter"
single-line
hide-details></v-text-field>
clearable
hide-details>
<template v-slot:prepend-inner>
<v-chip v-if="labelSearch"
class="mr-1"
small
close
@click:close="labelSearch=null"
:color="labels[labelSearch] && labels[labelSearch].view.colour"
:title="labels[labelSearch] && labels[labelSearch].view.description"
:dark="labels[labelSearch] && labels[labelSearch].view.dark"
:light="labels[labelSearch] && labels[labelSearch].view.light"
>{{labelSearch}}</v-chip>
</template>
</v-text-field>
</v-toolbar>
</v-card-title>
<v-card-text>
@@ -211,13 +229,14 @@
:headers="headers"
:items="rows"
:items-per-page.sync="itemsPerPage"
:server-items-length="eventCount"
item-key="key"
:item-class="itemClass"
sort-by="tstamp"
:sort-desc="true"
:search="filter"
:custom-filter="searchTable"
:loading="loading"
:loading="eventsLoading"
:options.sync="options"
fixed-header
:footer-props='{itemsPerPageOptions: [ 10, 25, 50, 100, 500, -1 ], showFirstLastPage: true}'
@click:row="setActiveItem"
@@ -245,12 +264,12 @@
:dark="labels[label] && labels[label].view.dark"
:light="labels[label] && labels[label].view.light"
:key="label"
:href="$route.path+'?label='+encodeURIComponent(label)"
@click="labelSearch=label"
>{{label}}</v-chip>
</span>
<dougal-event-edit-history v-if="entry.has_edits && writeaccess"
:id="entry.id"
:disabled="loading"
:disabled="eventsLoading"
:labels="labels"
></dougal-event-edit-history>
<span v-if="entry.meta.readonly"
@@ -381,7 +400,6 @@ export default {
}
],
items: [],
labels: {},
options: {},
filter: "",
filterableLabels: [ "QC", "QCAccepted" ],
@@ -390,7 +408,6 @@ export default {
eventDialog: false,
eventLabelsDialog: false,
defaultEventTimestamp: null,
presetRemarks: null,
remarksMenu: null,
remarksMenuItem: null,
editedEvent: {},
@@ -440,17 +457,6 @@ export default {
return Object.values(rows);
},
userLabels () {
const filtered = {};
for (const key in this.labels) {
if (this.labels[key].model.user) {
filtered[key] = this.labels[key];
}
}
return filtered;
},
popularLabels () {
const tuples = this.items.flatMap( i => i.labels )
.filter( l => (this.labels[l]??{})?.model?.user )
@@ -462,6 +468,10 @@ export default {
.sort( (a, b) => b[1]-a[1] );
},
presetRemarks () {
return this.projectConfiguration?.events?.presetRemarks ?? [];
},
defaultSequence () {
if (this.$route.params.sequence) {
return Number(this.$route.params.sequence.split(";").pop());
@@ -470,19 +480,24 @@ export default {
}
},
...mapGetters(['user', 'writeaccess', 'loading', 'online', 'sequence', 'line', 'point', 'position', 'timestamp', 'lineName', 'serverEvent']),
...mapGetters(['user', 'writeaccess', 'eventsLoading', 'online', 'sequence', 'line', 'point', 'position', 'timestamp', 'lineName', 'serverEvent', 'events', 'labels', 'userLabels']),
...mapState({projectSchema: state => state.project.projectSchema})
},
watch: {
options: {
handler () {
//this.getEvents();
async handler () {
await this.fetchEvents();
},
deep: true
},
async events () {
console.log("Events changed");
await this.fetchEvents();
},
eventDialog (val) {
if (val) {
// If not online
@@ -490,30 +505,14 @@ export default {
}
},
async serverEvent (event) {
if (event.channel == "event" && event.payload.schema == this.projectSchema) {
if (!this.loading && !this.queuedReload) {
// Do not force a non-cached response if refreshing as a result
// of an event notification. We will assume that the server has
// already had time to update the cache by the time our request
// gets back to it.
this.getEvents();
} else {
this.queuedReload = true;
}
filter (newVal, oldVal) {
if (newVal?.toLowerCase() != oldVal?.toLowerCase()) {
this.fetchEvents();
}
},
queuedReload (newVal, oldVal) {
if (newVal && !oldVal && !this.loading) {
this.getEvents();
}
},
loading (newVal, oldVal) {
if (!newVal && oldVal && this.queuedReload) {
this.getEvents();
}
labelSearch () {
this.fetchEvents();
},
itemsPerPage (newVal, oldVal) {
@@ -570,50 +569,15 @@ export default {
}
},
async getEventCount () {
//this.eventCount = await this.api([`/project/${this.$route.params.project}/event/?count`]);
this.eventCount = null;
},
async getEvents (opts = {}) {
const query = new URLSearchParams(this.options);
if (this.options.itemsPerPage < 0) {
query.delete("itemsPerPage");
}
if (this.$route.params.sequence) {
query.set("sequence", this.$route.params.sequence);
}
if (this.$route.params.date0) {
query.set("date0", this.$route.params.date0);
}
if (this.$route.params.date1) {
query.set("date1", this.$route.params.date1);
}
const url = `/project/${this.$route.params.project}/event?${query.toString()}`;
this.queuedReload = false;
this.items = await this.api([url, opts]) || [];
},
async getLabelDefinitions () {
const url = `/project/${this.$route.params.project}/label`;
const labelSet = {};
const labels = await this.api([url]) || [];
labels.forEach( l => labelSet[l.name] = l.data );
this.labels = labelSet;
},
async getPresetRemarks () {
const url = `/project/${this.$route.params.project}/configuration/events/presetRemarks`;
this.presetRemarks = await this.api([url]);
async fetchEvents (opts = {}) {
const options = {
text: this.filter,
label: this.labelSearch,
...this.options
};
const res = await this.getEvents([this.$route.params.project, options]);
this.items = res.events;
this.eventCount = res.count;
},
newItem (from = {}) {
@@ -687,7 +651,7 @@ export default {
if (!err && res.ok) {
this.showSnack(["Event saved", "success"]);
this.queuedReload = true;
this.getEvents({cache: "reload"});
this.fetchEvents({cache: "reload"});
}
}
@@ -705,7 +669,7 @@ export default {
if (!err && res.ok) {
this.showSnack(["Event saved", "success"]);
this.queuedReload = true;
this.getEvents({cache: "reload"});
this.fetchEvents({cache: "reload"});
}
}
@@ -752,7 +716,7 @@ export default {
if (!err && res.ok) {
this.showSnack([`${ids.length} events deleted`, "red"]);
this.queuedReload = true;
this.getEvents({cache: "reload"});
this.fetchEvents({cache: "reload"});
}
}
@@ -768,7 +732,7 @@ export default {
if (!err && res.ok) {
this.showSnack(["Event deleted", "red"]);
this.queuedReload = true;
this.getEvents({cache: "reload"});
this.fetchEvents({cache: "reload"});
}
}
@@ -802,19 +766,6 @@ export default {
},
searchTable (value, search, item) {
if (!value && !search) return true;
const s = search.toLowerCase();
if (typeof value === 'string') {
return value.toLowerCase().includes(s);
} else if (typeof value === 'number') {
return value == search;
} else {
return item.items.some( i => i.remarks.toLowerCase().includes(s) ) ||
item.items.some( i => i.labels.some( l => l.toLowerCase().includes(s) ));
}
},
viewOnMap(item) {
if (item?.meta && item.meta?.geometry?.type == "Point") {
const [ lon, lat ] = item.meta.geometry.coordinates;
@@ -853,14 +804,11 @@ export default {
*/
},
...mapActions(["api", "showSnack"])
...mapActions(["api", "showSnack", "refreshEvents", "getEvents"])
},
async mounted () {
await this.getLabelDefinitions();
this.getEventCount();
this.getEvents();
this.getPresetRemarks();
this.fetchEvents();
window.addEventListener('keyup', this.handleKeyboardEvent);
},

View File

@@ -375,7 +375,8 @@ export default {
}
],
labels: {},
hashMarker: null
hashMarker: null,
references: {}
};
},
@@ -474,7 +475,7 @@ export default {
bounds._northEast.lng,
bounds._northEast.lat
].map(i => i.toFixed(bboxScale)).join(",");
const limit = 10000;
const limit = 10000; // Empirical value
const query = new URLSearchParams({bbox, limit});
@@ -511,7 +512,9 @@ export default {
}
l.layer.clearLayers();
if (layer instanceof L.Layer || (layer.features && layer.features.length < limit) || ("length" in layer && layer.length < limit)) {
//if (layer instanceof L.Layer || (layer.features && layer.features.length < limit) || ("length" in layer && layer.length < limit)) {
if (layer instanceof L.Layer || ((layer.features?.length ?? layer?.length) < limit)) {
if (l.layer.addData) {
l.layer.addData(layer);
} else if (l.layer.addLayer) {
@@ -519,8 +522,12 @@ export default {
}
l.layer.lastRequestURL = url;
} else if (!layer.features) {
console.log(`Layer ${url} is empty`);
} else {
console.warn("Too much data from", url);
console.warn(`Too much data from ${url} (${layer.features?.length ?? layer.length} > ${limit} features)`);
this.showSnack([`Layer ${l.layer.options.userLayerName ? ""+l.layer.options.userLayerName+" " : ""}is too large: ${layer.features?.length ?? layer.length} features; maximum is ${limit}`, "error"]);
}
})
.finally( () => {
@@ -674,13 +681,140 @@ export default {
async getLabelDefinitions () {
const url = `/project/${this.$route.params.project}/label`;
const labelSet = {};
const labels = await this.api([url]) || [];
labels.forEach( l => labelSet[l.name] = l.data );
this.labels = labelSet;
this.labels = await this.api([url]) || [];
},
...mapActions(["api"])
removeUserLayers () {
map.eachLayer( layer => {
if (layer.options.userLayer === true) {
console.log("Removing", layer);
layer.eachLayer( sublayer => {
const idx = this.layerRefreshConfig.findIndex(i => i.layer == sublayer);
if (idx != -1) {
this.layerRefreshConfig.splice(idx, 1);
}
});
map.removeLayer(layer);
this.references.layerControl.removeLayer(layer);
}
});
},
async addUserLayers (userLayers) {
const options = {
userLayer: true,
style (feature) {
const style = {
stroke: undefined,
color: "grey",
weight: 2,
opacity: 0.5,
lineCap: undefined,
lineJoin: undefined,
dashArray: undefined,
dashOffset: undefined,
fill: undefined,
fillColor: "lightgrey",
fillOpacity: 0.5,
fillRule: undefined
};
for (let key in style) {
switch (key) {
case "color":
style[key] = feature.properties?.colour ?? feature.properties?.color ?? style[key];
break;
case "fillColor":
style[key] = feature.properties?.fillColour ?? feature.properties?.fillColor ?? style[key];
break;
default:
style[key] = feature.properties?.[key] ?? style[key];
}
if (typeof style[key] === "undefined") {
delete style[key];
}
}
return style;
}
};
const userLayerGroups = {};
userLayers.forEach(layer => {
if (!(layer.name in userLayerGroups)) {
userLayerGroups[layer.name] = [];
}
userLayerGroups[layer.name].push(layer);
});
for (let userLayerName in userLayerGroups) {
const userLayerGroup = userLayerGroups[userLayerName];
const layer = L.featureGroup(null, {userLayer: true, userLayerGroup: true, userLayerName});
userLayerGroup.forEach(l => {
const sublayer = L.geoJSON(null, {...options, userLayerName});
layer.addLayer(sublayer);
sublayer.on('add', ({target}) => {
this.refreshLayers([target])
});
if (l.tooltip) {
sublayer.bindTooltip((layer) => {
return layer?.feature?.properties?.[l.tooltip] ?? userLayerName;
});
}
if (l.popup) {
if (l.popup === true) {
sublayer.bindPopup((layer) => {
const p = layer?.feature?.properties;
let t = "";
if (p) {
t += "<table>";
for (let [k, v] of Object.entries(p)) {
t += `<tr><td><b>${k}: </b></td><td>${v}</td></tr>`;
}
t += "</table>";
return t;
}
return userLayerName;
});
} else {
sublayer.bindPopup((layer) => {
return layer?.feature?.properties?.[l.popup] ?? userLayerName;
});
}
}
const refreshConfig = {
layer: sublayer,
url: (query = "") => {
return `/files/${l.path}`;
}
};
this.layerRefreshConfig.push(refreshConfig);
});
layer.on('add', ({target}) => {
this.refreshLayers(target.getLayers())
});
this.references.layerControl.addOverlay(layer, `<span title="User layer" style="text-decoration: dotted underline;">${userLayerName}</span>`);
}
},
async fetchUserLayers () {
const url = `/project/${this.$route.params.project}/gis/layer`;
const userLayers = await this.api([url]) || [];
this.removeUserLayers();
this.addUserLayers(userLayers);
},
...mapActions(["api", "showSnack"])
},
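For context, fetchUserLayers expects each record returned by /gis/layer to describe one sublayer. A hypothetical example, with field names inferred from the properties read above (name, path, tooltip, popup) and values purely illustrative:

// Layers sharing "name" are grouped under one overlay toggle.
const exampleUserLayer = {
  name: "Obstructions",             // overlay label in the layer control
  path: "gis/obstructions.geojson", // fetched via /files/<path>
  tooltip: "label",                 // feature property shown as tooltip
  popup: true                       // true => property table; or a property name
};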
@@ -750,7 +884,7 @@ export default {
}
if (init.activeLayers) {
init.activeLayers.forEach(l => layers[l].addTo(map));
init.activeLayers.forEach(l => layers[l]?.addTo(map));
} else {
layers.OpenSeaMap.addTo(map);
layers.Preplots.addTo(map);
@@ -759,6 +893,9 @@ export default {
const layerControl = L.control.layers(tileMaps, layers).addTo(map);
const scaleControl = L.control.scale().addTo(map);
this.references.layerControl = layerControl;
this.references.scaleControl = scaleControl;
if (init.position) {
map.setView(init.position.slice(1), init.position[0]);
} else {
@@ -786,10 +923,13 @@ export default {
map.on('layeradd', this.updateURL);
map.on('layerremove', this.updateURL);
this.layerRefreshConfig.forEach( l => {
l.layer.on('add', ({target}) => this.refreshLayers([target]));
});
this.fetchUserLayers();
if (init.position) {
this.refreshLayers();
} else {

View File

@@ -44,6 +44,7 @@
label="Filter"
single-line
clearable
hint="Filter by sequence, line, first or last shotpoints, remarks or start/end time"
></v-text-field>
</v-toolbar>
</v-card-title>
@@ -109,11 +110,14 @@
:headers="headers"
:items="items"
:items-per-page.sync="itemsPerPage"
:server-items-length="sequenceCount"
item-key="sequence"
:search="filter"
:loading="loading"
:fixed-header="true"
:loading="plannedSequencesLoading"
fixed-header
no-data-text="No planned lines. Add lines via the context menu from either the Lines or Sequences view."
:item-class="(item) => (activeItem == item && !edit) ? 'blue accent-1 elevation-3' : ''"
:footer-props="{showFirstLastPage: true}"
@click:row="setActiveItem"
@contextmenu:row="contextMenu"
>
@@ -275,7 +279,7 @@
icon
small
title="Edit"
:disabled="loading"
:disabled="plannedSequencesLoading"
@click="editItem(item, 'remarks')"
>
<v-icon small>mdi-square-edit-outline</v-icon>
@@ -417,7 +421,8 @@ export default {
remarks: null,
editRemarks: false,
filter: null,
num_lines: null,
options: {},
sequenceCount: null,
activeItem: null,
edit: null, // {sequence, key, value}
queuedReload: false,
@@ -552,11 +557,22 @@ export default {
},
computed: {
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
...mapGetters(['user', 'writeaccess', 'plannedSequencesLoading', 'plannedSequences', 'planRemarks'])
},
watch: {
options: {
handler () {
this.fetchPlannedSequences();
},
deep: true
},
async plannedSequences () {
await this.fetchPlannedSequences();
},
async edit (newVal, oldVal) {
if (newVal === null && oldVal !== null) {
const item = this.items.find(i => i.sequence == oldVal.sequence);
@@ -587,41 +603,9 @@ export default {
}
},
async serverEvent (event) {
if (event.channel == "planned_lines" && event.payload.pid == this.$route.params.project) {
// Ignore non-ops
/*
if (event.payload.old === null && event.payload.new === null) {
return;
}
*/
if (!this.loading && !this.queuedReload) {
// Do not force a non-cached response if refreshing as a result
// of an event notification. We will assume that the server has
// already had time to update the cache by the time our request
// gets back to it.
this.getPlannedLines();
} else {
this.queuedReload = true;
}
} else if (event.channel == "info" && event.payload.pid == this.$route.params.project) {
if (event.payload?.new?.key == "plan" && ("remarks" in (event.payload?.new?.value || {}))) {
this.remarks = event.payload?.new.value.remarks;
}
}
},
queuedReload (newVal, oldVal) {
if (newVal && !oldVal && !this.loading) {
this.getPlannedLines();
}
},
loading (newVal, oldVal) {
if (!newVal && oldVal && this.queuedReload) {
this.getPlannedLines();
filter (newVal, oldVal) {
if (newVal?.toLowerCase() != oldVal?.toLowerCase()) {
this.fetchPlannedSequences();
}
},
@@ -890,7 +874,6 @@ export default {
const url = `/project/${this.$route.params.project}/plan/${this.contextMenuItem.sequence}`;
const init = {method: "DELETE"};
await this.api([url, init]);
await this.getPlannedLines();
},
editItem (item, key, value) {
@@ -942,21 +925,6 @@ export default {
}
},
async getPlannedLines () {
const url = `/project/${this.$route.params.project}/plan`;
this.queuedReload = false;
this.items = await this.api([url]) || [];
for (const item of this.items) {
item.ts0 = new Date(item.ts0);
item.ts1 = new Date(item.ts1);
this.wxQuery(item).then( (wx) => {
item.meta = {...item.meta, wx};
});
}
},
async getPlannerConfig () {
const url = `/project/${this.$route.params.project}/configuration/planner`;
this.plannerConfig = await this.api([url]) || {
@@ -967,14 +935,15 @@ export default {
}
},
async getPlannerRemarks () {
const url = `/project/${this.$route.params.project}/info/plan/remarks`;
this.remarks = await this.api([url]) || "";
},
async getSequences () {
const url = `/project/${this.$route.params.project}/sequence`;
this.sequences = await this.api([url]) || [];
async fetchPlannedSequences (opts = {}) {
const options = {
text: this.filter,
...this.options
};
const res = await this.getPlannedSequences([this.$route.params.project, options]);
this.items = res.sequences;
this.sequenceCount = res.count;
this.remarks = this.planRemarks;
},
setActiveItem (item) {
@@ -983,13 +952,12 @@ export default {
: item;
},
...mapActions(["api", "showSnack"])
...mapActions(["api", "showSnack", "getPlannedSequences"])
},
async mounted () {
await this.getPlannerConfig();
this.getPlannedLines();
this.getPlannerRemarks();
await this.fetchPlannedSequences();
}
}

View File

@@ -1,18 +1,25 @@
<template>
<v-container fluid fill-height class="ma-0 pa-0">
<v-row no-gutters align="stretch" class="fill-height">
<v-col cols="12">
<v-col cols="12" v-if="projectFound">
<!-- Show component here according to selected route -->
<keep-alive>
<router-view :key="$route.path"></router-view>
</keep-alive>
</v-col>
<v-col cols="12" v-else>
<v-card>
<v-card-text>
Project does not exist.
</v-card-text>
</v-card>
</v-col>
</v-row>
</v-container>
</template>
<script>
import { mapActions } from 'vuex'
import { mapActions, mapGetters } from 'vuex'
export default {
name: 'Project',
@@ -24,12 +31,53 @@ export default {
}
},
computed: {
projectFound () {
return this.loading || this.projectId;
},
...mapGetters(["loading", "projectId", "projectSchema", "serverEvent"])
},
watch: {
async serverEvent (event) {
if (event.channel == "project" && event.payload?.operation == "DELETE" && event.payload?.schema == "public") {
// Project potentially deleted
await this.getProject(this.$route.params.project);
} else if (event.payload?.schema == this.projectSchema) {
if (event.channel == "event") {
this.refreshEvents();
} else if (event.channel == "planned_lines") {
this.refreshPlan();
} else if (["raw_lines", "final_lines", "final_shots"].includes(event.channel)) {
this.refreshSequences();
} else if (["preplot_lines", "preplot_points"].includes(event.channel)) {
this.refreshLines();
} else if (event.channel == "info") {
if ((event.payload?.new ?? event.payload?.old)?.key == "plan") {
this.refreshPlan();
}
} else if (event.channel == "project") {
this.getProject(this.$route.params.project);
}
}
}
},
methods: {
...mapActions(["getProject"])
...mapActions(["getProject", "refreshLines", "refreshSequences", "refreshEvents", "refreshLabels", "refreshPlan"])
},
async mounted () {
await this.getProject(this.$route.params.project);
if (this.projectFound) {
this.refreshLines();
this.refreshSequences();
this.refreshEvents();
this.refreshLabels();
this.refreshPlan();
}
}
}

View File

@@ -83,12 +83,22 @@ export default {
},
computed: {
...mapGetters(['loading'])
...mapGetters(['loading', 'serverEvent'])
},
watch: {
async serverEvent (event) {
if (event.channel == "project" && event.payload?.schema == "public") {
if (event.payload?.operation == "DELETE" || event.payload?.operation == "INSERT") {
await this.load();
}
}
}
},
methods: {
async list () {
this.items = await this.api(["/project/"]) || [];
this.items = await this.api(["/project"]) || [];
},
async summary (item) {

View File

@@ -433,10 +433,7 @@ export default {
async getLabelDefinitions () {
const url = `/project/${this.$route.params.project}/label`;
const labelSet = {};
const labels = await this.api([url]) || [];
labels.forEach( l => labelSet[l.name] = l.data );
this.labels = labelSet;
this.labels = await this.api([url]) || {};
},
async getQCData () {

View File

@@ -148,15 +148,16 @@
:headers="headers"
:items="items"
:items-per-page.sync="itemsPerPage"
:server-items-length="sequenceCount"
item-key="sequence"
:server-items-length="num_rows"
:search="filter"
:custom-filter="customFilter"
:loading="loading"
:fixed-header="true"
:footer-props='{itemsPerPageOptions: [ 10, 25, 50, 100, 500, -1 ]}'
show-expand
:item-class="(item) => activeItem == item ? 'blue accent-1 elevation-3' : ''"
:search="filter"
x-custom-filter="customFilter"
:loading="sequencesLoading"
:options.sync="options"
fixed-header
:footer-props='{itemsPerPageOptions: [ 10, 25, 50, 100, 500, -1 ], showFirstLastPage: true}'
show-expand
@click:row="setActiveItem"
@contextmenu:row="contextMenu"
>
@@ -176,7 +177,7 @@
icon
small
title="Cancel edit"
:disabled="loading"
:disabled="sequencesLoading"
@click="edit.value = item.remarks; edit = null"
>
<v-icon small>mdi-close</v-icon>
@@ -185,7 +186,7 @@
icon
small
title="Save edits"
:disabled="loading"
:disabled="sequencesLoading"
@click="edit = null"
>
<v-icon small>mdi-content-save-edit-outline</v-icon>
@@ -196,7 +197,7 @@
icon
small
title="Edit"
:disabled="loading"
:disabled="sequencesLoading"
@click="editItem(item, 'remarks')"
>
<v-icon small>mdi-square-edit-outline</v-icon>
@@ -210,7 +211,7 @@
class="markdown"
autofocus
placeholder="Enter your text here"
:disabled="loading"
:disabled="sequencesLoading"
v-model="edit.value"
>
</v-textarea>
@@ -228,7 +229,7 @@
icon
small
title="Cancel edit"
:disabled="loading"
:disabled="sequencesLoading"
@click="edit.value = item.remarks_final; edit = null"
>
<v-icon small>mdi-close</v-icon>
@@ -237,7 +238,7 @@
icon
small
title="Save edits"
:disabled="loading"
:disabled="sequencesLoading"
@click="edit = null"
>
<v-icon small>mdi-content-save-edit-outline</v-icon>
@@ -248,7 +249,7 @@
icon
small
title="Edit"
:disabled="loading"
:disabled="sequencesLoading"
@click="editItem(item, 'remarks_final')"
>
<v-icon small>mdi-square-edit-outline</v-icon>
@@ -262,7 +263,7 @@
class="markdown"
autofocus
placeholder="Enter your text here"
:disabled="loading"
:disabled="sequencesLoading"
v-model="edit.value"
>
</v-textarea>
@@ -292,9 +293,13 @@
<v-list-item v-for="(path, index) in item.raw_files"
key="index"
link
title="View the shot log"
title="Download file"
:href="`/api/files${path}`"
>
{{ basename(path) }}
<v-list-item-action>
<v-icon right small>mdi-cloud-download</v-icon>
</v-list-item-action>
</v-list-item>
</v-list-group>
<v-list-group value="true" v-if="item.final_files">
@@ -308,10 +313,13 @@
</template>
<v-list-item v-for="(path, index) in item.final_files"
key="index"
link
title="View the shot log"
title="Download file"
:href="`/api/files${path}`"
>
{{ basename(path) }}
<v-list-item-action>
<v-icon right small>mdi-cloud-download</v-icon>
</v-list-item-action>
</v-list-item>
</v-list-group>
</v-list>
@@ -559,7 +567,7 @@ export default {
items: [],
filter: "",
options: {},
num_rows: null,
sequenceCount: null,
activeItem: null,
edit: null, // {sequence, key, value}
queuedReload: false,
@@ -586,17 +594,22 @@ export default {
return this.queuedItems.find(i => i.payload.sequence == this.contextMenuItem.sequence);
},
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
...mapGetters(['user', 'writeaccess', 'sequencesLoading', 'sequences'])
},
watch: {
options: {
handler () {
this.getSequences();
this.fetchSequences();
},
deep: true
},
async sequences () {
await this.fetchSequences();
},
async edit (newVal, oldVal) {
if (newVal === null && oldVal !== null) {
const item = this.items.find(i => i.sequence == oldVal.sequence);
@@ -610,39 +623,9 @@ export default {
}
},
async serverEvent (event) {
const subscriptions = ["raw_lines", "final_lines", "final_shots"];
if (subscriptions.includes(event.channel) && event.payload.pid == this.$route.params.project) {
if (!this.loading && !this.queuedReload) {
// Do not force a non-cached response if refreshing as a result
// of an event notification. We will assume that the server has
// already had time to update the cache by the time our request
// gets back to it.
this.getSequences();
} else {
this.queuedReload = true;
}
} else if (event.channel == "queue_items") {
const project =
event.payload?.project ??
event.payload?.new?.payload?.project ??
event.payload?.old?.payload?.project;
if (project == this.$route.params.project) {
this.getQueuedItems();
}
}
},
queuedReload (newVal, oldVal) {
if (newVal && !oldVal && !this.loading) {
this.getSequences();
}
},
loading (newVal, oldVal) {
if (!newVal && oldVal && this.queuedReload) {
this.getSequences();
filter (newVal, oldVal) {
if (newVal?.toLowerCase() != oldVal?.toLowerCase()) {
this.fetchSequences();
}
},
@@ -811,19 +794,14 @@ export default {
this.num_rows = projectInfo.sequences;
},
async getSequences () {
const query = new URLSearchParams(this.options);
query.set("filter", this.filter);
query.set("files", true);
if (this.options.itemsPerPage < 0) {
query.delete("itemsPerPage");
}
const url = `/project/${this.$route.params.project}/sequence?${query.toString()}`;
this.queuedReload = false;
this.items = await this.api([url]) || [];
async fetchSequences (opts = {}) {
const options = {
text: this.filter,
...this.options
};
const res = await this.getSequences([this.$route.params.project, options]);
this.items = res.sequences;
this.sequenceCount = res.count;
},
async getQueuedItems () {
@@ -871,11 +849,11 @@ export default {
return false;
},
...mapActions(["api", "showSnack"])
...mapActions(["api", "showSnack", "getSequences"])
},
mounted () {
this.getSequences();
this.fetchSequences();
this.getNumLines();
this.getQueuedItems();
}

View File

@@ -4,6 +4,7 @@ module.exports = {
"leaflet-arrowheads"
],
devServer: {
host: "0.0.0.0",
proxy: {
"^/api(/|$)": {
target: "http://localhost:3000",

View File

@@ -1,6 +1,7 @@
const http = require('http');
const express = require('express');
express.yaml ??= require('body-parser').yaml; // NOTE: Use own customised body-parser
const cookieParser = require('cookie-parser')
const maybeSendAlert = require("../lib/alerts");
@@ -9,7 +10,7 @@ const mw = require('./middleware');
const { ERROR, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
const verbose = process.env.NODE_ENV != 'test';
const app = express();
app.locals.version = "0.3.1"; // API version
app.locals.version = "0.4.0"; // API version
app.map = function(a, route){
route = route || '';
@@ -31,6 +32,7 @@ app.map = function(a, route){
};
app.use(express.json({type: "application/json", strict: false, limit: '10mb'}));
app.use(express.yaml({type: "application/yaml", limit: '10mb'}));
app.use(express.urlencoded({ type: "application/x-www-form-urlencoded", extended: true }));
app.use(express.text({type: "text/*", limit: '10mb'}));
app.use((req, res, next) => {
@@ -87,13 +89,19 @@ app.use(mw.etag.ifNoneMatch);
// We must be authenticated before we can access these
app.map({
'/project': {
get: [ mw.project.list ], // Get list of projects
get: [ mw.project.get ], // Get list of projects
post: [ mw.auth.access.admin, mw.project.post ], // Create a new project
},
'/project/:project': {
get: [ mw.project.get ], // Get project data
get: [ mw.project.summary.get ], // Get project data
delete: [ mw.auth.access.admin, mw.project.delete ], // Delete a project (only if empty)
},
'/project/:project/summary': {
get: [ mw.project.get ],
get: [ mw.project.summary.get ],
},
'/project/:project/configuration': {
get: [ mw.project.configuration.get ], // Get project configuration
patch: [ mw.auth.access.admin, mw.project.configuration.patch ], // Modify project configuration
},
/*
@@ -101,19 +109,25 @@ app.map({
*/
'/project/:project/gis': {
get: [ mw.gis.project.bbox ]
get: [ mw.etag.noSave, mw.gis.project.bbox ]
},
'/project/:project/gis/preplot': {
get: [ mw.gis.project.preplot ]
get: [ mw.etag.noSave, mw.gis.project.preplot ]
},
'/project/:project/gis/preplot/:featuretype(line|point)': {
get: [ mw.gis.project.preplot ]
get: [ mw.etag.noSave, mw.gis.project.preplot ]
},
'/project/:project/gis/raw/:featuretype(line|point)': {
get: [ mw.gis.project.raw ]
get: [ mw.etag.noSave, mw.gis.project.raw ]
},
'/project/:project/gis/final/:featuretype(line|point)': {
get: [ mw.gis.project.final ]
get: [ mw.etag.noSave, mw.gis.project.final ]
},
'/project/:project/gis/layer': {
get: [ mw.etag.noSave, mw.gis.project.layer.get ]
},
'/project/:project/gis/layer/:name': {
get: [ mw.etag.noSave, mw.gis.project.layer.get ]
},
/*
@@ -167,6 +181,9 @@ app.map({
post: [ mw.auth.access.write, mw.event.post ],
put: [ mw.auth.access.write, mw.event.put ],
delete: [ mw.auth.access.write, mw.event.delete ],
'changes/:since': {
get: [ mw.event.changes ]
},
// TODO Rename -/:sequence → sequence/:sequence
'-/:sequence/': { // NOTE: We need to avoid conflict with the next endpoint ☹
get: [ mw.event.sequence.get ],
@@ -186,25 +203,25 @@ app.map({
'/project/:project/qc': {
'/results': {
// Get all QC results for :project
get: [ mw.qc.results.get ],
get: [ mw.etag.noSave, mw.qc.results.get ],
// Delete all QC results for :project
delete: [ mw.auth.access.write, mw.qc.results.delete ],
delete: [ mw.etag.noSave, mw.auth.access.write, mw.qc.results.delete ],
'/accept': {
post: [ mw.auth.access.write, mw.qc.results.accept ]
post: [ mw.etag.noSave, mw.auth.access.write, mw.qc.results.accept ]
},
'/unaccept': {
post: [ mw.auth.access.write, mw.qc.results.unaccept ]
post: [ mw.etag.noSave, mw.auth.access.write, mw.qc.results.unaccept ]
},
'/sequence/:sequence': {
// Get QC results for :project, :sequence
get: [ mw.qc.results.get ],
get: [ mw.etag.noSave, mw.qc.results.get ],
// Delete QC results for :project, :sequence
delete: [ mw.auth.access.write, mw.qc.results.delete ]
delete: [ mw.etag.noSave, mw.auth.access.write, mw.qc.results.delete ]
}
}
},
@@ -247,10 +264,16 @@ app.map({
// // post: [ mw.permissions.post ],
// // delete: [ mw.permissions.delete ]
// },
'/project/:project/files/:path(*)': {
get: [ mw.auth.access.write, mw.files.get ]
},
'/files/?:path(*)': {
get: [ mw.auth.access.write, mw.etag.noSave, mw.files.get ]
},
'/navdata/': {
get: [ mw.navdata.get ],
get: [ mw.etag.noSave, mw.navdata.get ],
'gis/:featuretype(line|point)': {
get: [ mw.gis.navdata.get ]
get: [ mw.etag.noSave, mw.gis.navdata.get ]
}
},
'/info/': {
@@ -329,15 +352,14 @@ app.disable('x-powered-by');
app.enable('trust proxy');
INFO('trust proxy is ' + (app.get('trust proxy')? 'on' : 'off'));
const addr = "127.0.0.1";
if (!module.parent) {
var port = process.env.HTTP_PORT || 3000;
var server = http.createServer(app).listen(port, addr);
const port = process.env.HTTP_PORT || 3000;
const host = process.env.HTTP_HOST || "127.0.0.1";
var server = http.createServer(app).listen(port, host);
INFO('API started on port ' + port);
} else {
app.start = function (port = 3000, path) {
app.start = function (port = 3000, host = "127.0.0.1", path) {
var root = app;
if (path) {
@@ -346,9 +368,9 @@ if (!module.parent) {
root.use(path, app);
}
const server = http.createServer(root).listen(port, addr);
const server = http.createServer(root).listen(port, host);
if (server) {
// console.log(`API started on port ${port}, prefix: ${path || "/"}`);
console.log(`API started on port ${port}, prefix: ${path || "/"}`);
INFO(`API started on port ${port}, prefix: ${path || "/"}`);
}
return server;

View File

@@ -1,4 +1,4 @@
const expressJWT = require('express-jwt');
const {expressjwt: expressJWT} = require('express-jwt');
const cfg = require("../../../lib/config").jwt;
@@ -15,11 +15,12 @@ const options = {
secret: cfg.secret,
credentialsRequired: false,
algorithms: ['HS256'],
requestProperty: "user",
getToken
};
const allow = {
path: [/\/login$/, /\/logout$/],
path: [/\/login$/, /\/logout$/, /\/$/, /\/version$/],
useOriginalUrl: false
};

View File

@@ -33,7 +33,9 @@ function saveResponse (res) {
const cache = getCache(res);
const req = res.req;
console.log(`Saving ETag: ${req.method} ${req.url} ${etag}`);
cache[req.url] = {etag, headers: res.getHeaders()};
const headers = structuredClone(res.getHeaders());
delete headers["set-cookie"];
cache[req.url] = {etag, headers};
}
}
};
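set-cookie is stripped before caching because the cached header set is replayed verbatim on later conditional requests; keeping it would hand one client another client's session cookie. A reduced sketch of what the ifNoneMatch counterpart might do (its actual implementation is not shown in this diff):

// Sketch, assuming the {etag, headers} cache shape written by saveResponse().
function ifNoneMatch (req, res, next) {
  const hit = getCache(res)[req.url];
  if (hit && req.headers['if-none-match'] === hit.etag) {
    res.set(hit.headers);   // safe to replay: set-cookie was never stored
    return res.status(304).end();
  }
  next();
}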

View File

@@ -43,15 +43,26 @@ const rels = [
matches: [ ],
callback (url, data) {
if (data.payload?.table == "info") {
const pid = data.payload?.pid;
const key = (data.payload?.new ?? data.payload?.old)?.key;
const rx = /^\/project\/([^\/]+)\/info\/([^\/?]+)[\/?]?/;
const match = url.match(rx);
if (match) {
if (match[1] == data.payload.pid) {
if (match[1] == pid) {
if (match[2] == data.payload?.old?.key || match[2] == data.payload?.new?.key) {
return true;
}
}
}
if (key == "plan") {
const rx = /^\/project\/([^\/]+)\/plan[\/?]?/;
const match = url.match(rx);
if (match) {
return match[1] == pid;
}
}
}
return false;
}

View File

@@ -0,0 +1,14 @@
const { event } = require('../../../lib/db');
const json = async function (req, res, next) {
try {
const response = await event.changes(req.params.project, req.params.since, req.query);
res.status(200).send(response);
next();
} catch (err) {
next(err);
}
};
module.exports = json;
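This middleware backs the GET /project/:project/event/changes/:since route added to the API map above. A client-side usage sketch; the millisecond epoch format and the response shape are assumptions, and the project id is illustrative:

// Poll for events changed since the last known epoch.
const since = Date.now() - 60_000;
const res = await fetch(`/api/project/p101/event/changes/${since}`);
const changes = res.ok ? await res.json() : [];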

View File

@@ -6,5 +6,6 @@ module.exports = {
post: require('./post'),
put: require('./put'),
patch: require('./patch'),
delete: require('./delete')
delete: require('./delete'),
changes: require('./changes')
}

View File

@@ -0,0 +1,85 @@
const { stringify } = require('csv');
const { transform, prepare } = require('../../../../../lib/sse');
const json = async function (req, res, next) {
try {
const query = req.query;
query.sequence = req.params.sequence;
const {events, sequences} = await prepare(req.params.project, query);
if ("download" in query || "d" in query) {
const extension = "csv";
// Get the sequence number(s) (more than one sequence can be selected)
const seqNums = query.sequence.split(";");
// If we've only been asked for a single sequence, get its line name
const lineName = (sequences.find(i => i.sequence == seqNums[0]) || {})?.meta?.lineName;
const filename = (seqNums.length == 1 && lineName)
? `${lineName}-NavLog.${extension}`
: `${req.params.project}-${query.sequence}.${extension}`;
res.set("Content-Disposition", `attachment; filename="${filename}"`);
}
const columns = {
id: "id",
unix_epoch: (row) => Math.floor(row.tstamp/1000),
timestamp: (row) => (new Date(row.tstamp)).toISOString(),
sequence: "sequence",
point: "point",
text: "remarks",
labels: (row) => row.labels.join(";"),
latitude: (row) => {
if (row.meta.geometry?.type == "Point" && row.meta.geometry?.coordinates) {
return row.meta.geometry.coordinates[1];
}
},
longitude: (row) => {
if (row.meta.geometry?.type == "Point" && row.meta.geometry?.coordinates) {
return row.meta.geometry.coordinates[0];
}
}
};
let fields = [ "timestamp", "sequence", "point", "text", "labels", "latitude", "longitude", "id" ];
if (req.query.fields) {
fields = req.query.fields.split(/[,;:.\s+*|]+/);
}
let delimiter = req.query.delimiter || ",";
const stringifier = stringify({delimiter});
stringifier.on('error', (err) => {
console.error(err.message);
});
stringifier.on('readable', () => {
let row;
while ((row = stringifier.read()) !== null) {
res.write(row);
}
});
res.status(200);
if (!req.query.header || req.query.header.toLowerCase() == "true" || req.query.header == "1") {
// Send header
stringifier.write(fields);
}
events.forEach( event => {
stringifier.write(fields.map( field => {
if (typeof columns[field] === "function") {
return columns[field](event);
} else {
return event[columns[field]];
}
}));
});
stringifier.end();
res.end();
next();
} catch (err) {
next(err);
}
};
module.exports = json;
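The formatter honours several query parameters whose behaviour can be read directly from the code above: fields selects and orders columns, delimiter overrides the comma, header=false suppresses the header row, and download (or d) forces a Content-Disposition attachment. A usage sketch with an illustrative project and sequence:

const params = new URLSearchParams({
  mime: 'text/csv',
  fields: 'timestamp,sequence,point,text',
  delimiter: ';',
  header: 'false',   // omit the header row
  download: ''       // presence alone triggers the attachment filename
});
// The browser saves the file using the server-supplied Content-Disposition.
window.location.assign(`/api/project/p101/event/-/123?${params}`);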

View File

@@ -2,6 +2,7 @@ const json = require('./json');
const geojson = require('./geojson');
const seis = require('./seis');
const html = require('./html');
const csv = require('./csv');
const pdf = require('./pdf');
module.exports = async function (req, res, next) {
@@ -11,6 +12,7 @@ module.exports = async function (req, res, next) {
"application/geo+json": geojson,
"application/vnd.seis+json": seis,
"text/html": html,
"text/csv": csv,
"application/pdf": pdf
};

View File

@@ -0,0 +1,29 @@
const files = require('../../../lib/files');
module.exports = async function (req, res, next) {
try {
const entity = await files.get(req.params.path, req.params.project, req.query);
if (entity) {
if (entity.download) {
res.download(...entity.download, (err) => next(err));
} else {
// Directory listing
res.status(203).json(entity);
next();
}
} else {
throw {
status: 404,
code: "ENOENT"
};
}
} catch (err) {
if (err.code == 'ENOENT') {
res.status(404).json({message: err.code});
} else {
next(err);
}
}
};

View File

@@ -0,0 +1,7 @@
module.exports = {
get: require('./get'),
post: require('./post'),
put: require('./put'),
delete: require('./delete')
}

View File

@@ -2,5 +2,6 @@ module.exports = {
bbox: require('./bbox'),
preplot: require('./preplot'),
raw: require('./raw'),
final: require('./final')
final: require('./final'),
layer: require('./layer')
};

View File

@@ -0,0 +1,18 @@
const { gis } = require('../../../../../lib/db');
module.exports = async function (req, res, next) {
try {
const layers = await gis.project.layer.get(req.params.project, req.params.name);
if (req.params.name && (!layers || !layers.length)) {
res.status(404).json({message: "Not found"});
} else {
res.status(200).send(layers ?? []);
}
next();
} catch (err) {
next(err);
}
};

View File

@@ -0,0 +1,3 @@
module.exports = {
get: require('./get')
};

View File

@@ -1,5 +1,6 @@
module.exports = {
event: require('./event'),
files: require('./files'),
plan: require('./plan'),
line: require('./line'),
project: require('./project'),

View File

@@ -1,10 +1,11 @@
const { label } = require('../../../lib/db');
const { project } = require('../../../lib/db');
module.exports = async function (req, res, next) {
try {
res.status(200).send(await label.list(req.params.project, req.query));
const labels = (await project.configuration.get(req.params.project))?.labels ?? {};
res.status(200).send(labels);
next();
} catch (err) {
next(err);

View File

@@ -1,9 +1,14 @@
const { plan } = require('../../../../lib/db');
const { plan, info } = require('../../../../lib/db');
const json = async function (req, res, next) {
try {
const response = await plan.list(req.params.project, req.query);
const sequences = await plan.list(req.params.project, req.query) ?? [];
const remarks = await info.get(req.params.project, "plan/remarks", req.query, req.user.role) ?? null;
const response = {
remarks,
sequences
};
res.status(200).send(response);
next();
} catch (err) {

View File

@@ -0,0 +1,13 @@
const { project } = require('../../../../lib/db');
module.exports = async function (req, res, next) {
try {
res.status(200).send(await project.configuration.get(req.params.project));
next();
} catch (err) {
next(err);
}
};

View File

@@ -0,0 +1,8 @@
module.exports = {
get: require('./get'),
// post: require('./post'),
// put: require('./put'),
patch: require('./patch'),
// delete: require('./delete'),
};

View File

@@ -0,0 +1,16 @@
const { project } = require('../../../../lib/db');
module.exports = async function (req, res, next) {
try {
// TODO
// Implement If-Match header requirements
res.send(await project.configuration.patch(req.params.project, req.body));
next();
} catch (err) {
next(err);
}
};
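One way the If-Match TODO might be resolved, assuming an ETag can be derived from the stored configuration; computeEtag() is a hypothetical helper, not part of this codebase:

// Hypothetical sketch of the If-Match check; computeEtag() is invented.
const current = await project.configuration.get(req.params.project);
const etag = computeEtag(current);
if (req.get("If-Match") && req.get("If-Match") !== etag) {
  res.status(412).send();   // Precondition Failed: configuration changed under the client
} else {
  res.send(await project.configuration.patch(req.params.project, req.body));
}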

View File

@@ -0,0 +1,15 @@
const { project } = require('../../../lib/db');
module.exports = async function (req, res, next) {
try {
await project.delete(req.params.project);
res.status(204).send();
next();
} catch (err) {
next(err);
}
};

View File

@@ -4,10 +4,11 @@ const { project} = require('../../../lib/db');
module.exports = async function (req, res, next) {
try {
res.status(200).send(await project.get(req.params.project));
res.status(200).send(await project.get());
next();
} catch (err) {
next(err);
}
};

View File

@@ -1,4 +1,7 @@
module.exports = {
list: require('./list'),
get: require('./get')
get: require('./get'),
post: require('./post'),
delete: require('./delete'),
summary: require('./summary'),
configuration: require('./configuration'),
};

View File

@@ -1,14 +0,0 @@
const { project} = require('../../../lib/db');
module.exports = async function (req, res, next) {
try {
res.status(200).send(await project.list());
next();
} catch (err) {
next(err);
}
};

View File

@@ -0,0 +1,16 @@
const { project } = require('../../../lib/db');
module.exports = async function (req, res, next) {
try {
const payload = req.body;
const projectDefinition = await project.post(payload);
res.status(201).send(projectDefinition);
next();
} catch (err) {
next(err);
}
};

View File

@@ -0,0 +1,13 @@
const { project } = require('../../../../lib/db');
module.exports = async function (req, res, next) {
try {
res.status(200).send(await project.summary.get(req.params.project));
next();
} catch (err) {
next(err);
}
};

View File

@@ -0,0 +1,3 @@
module.exports = {
get: require('./get'),
};

View File

@@ -9,105 +9,16 @@ const { ALERT, ERROR, WARNING, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
* the last shot and first shot of the previous and current dates, respectively.
*/
class DetectFDSP {
/* Data may come much faster than we can process it, so we put it
* in a queue and process it at our own pace.
*
* The run() method fills the queue with the necessary data and then
* calls processQueue().
*
* The processQueue() method looks at the first two elements in
* the queue and processes them if they are not already being taken
* care of by a previous processQueue() call; this will happen when
* data is coming in faster than it can be processed.
*
* If the processQueue() call is the first to see the bottommost
* two elements, it will process them and, when finished, it will set
* the `isPending` flag of the bottommost element to `false`, thus
* letting the next call know that it has work to do.
*
* If the queue was empty, run() will set the `isPending` flag of its
* first element to a falsy value, thus bootstrapping the process.
*/
static MAX_QUEUE_SIZE = 125000;
queue = [];
author = `*${this.constructor.name}*`;
prev = null;
async processQueue () {
DEBUG("Queue length", this.queue.length)
while (this.queue.length > 1) {
if (this.queue[0].isPending) {
setImmediate(() => this.processQueue());
return;
}
const prev = this.queue.shift();
const cur = this.queue[0];
const sequence = Number(cur._sequence);
try {
if (prev.lineName == cur.lineName && prev._sequence == cur._sequence &&
prev.lineStatus == "online" && cur.lineStatus == "online" && sequence) {
// DEBUG("Previous", prev);
// DEBUG("Current", cur);
if (prev.time.substr(0, 10) != cur.time.substr(0, 10)) {
// Possibly a date change, but could also be a missing timestamp
// or something else.
const ts0 = new Date(prev.time)
const ts1 = new Date(cur.time);
if (!isNaN(ts0) && !isNaN(ts1) && ts0.getUTCDay() != ts1.getUTCDay()) {
INFO("Sequence shot across midnight UTC detected", cur._sequence, cur.lineName);
const ldsp = {
sequence: prev._sequence,
point: prev._point,
remarks: "Last shotpoint of the day",
labels: ["LDSP", "Prod"],
meta: {auto: true, insertedBy: this.constructor.name}
};
const fdsp = {
sequence: cur._sequence,
point: cur._point,
remarks: "First shotpoint of the day",
labels: ["FDSP", "Prod"],
meta: {auto: true, insertedBy: this.constructor.name}
};
INFO("LDSP", ldsp);
INFO("FDSP", fdsp);
const projectId = await schema2pid(prev._schema);
if (projectId) {
await event.post(projectId, ldsp);
await event.post(projectId, fdsp);
} else {
ERROR("projectId not found for", prev._schema);
}
} else {
WARNING("False positive on these timestamps", prev.time, cur.time);
WARNING("No events were created");
}
}
}
// Processing of this shot has already been completed.
// The queue can now move forward.
} catch (err) {
ERROR(err);
} finally {
cur.isPending = false;
}
}
}
constructor () {
DEBUG(`${this.author} instantiated`);
}
async run (data) {
async run (data, ctx) {
if (!data || data.channel !== "realtime") {
return;
}
@@ -116,27 +27,70 @@ class DetectFDSP {
return;
}
const meta = data.payload.new.meta;
if (this.queue.length < DetectFDSP.MAX_QUEUE_SIZE) {
const event = {
isPending: this.queue.length,
_schema: meta._schema,
time: meta.time,
lineStatus: meta.lineStatus,
_sequence: meta._sequence,
_point: meta._point,
lineName: meta.lineName
};
this.queue.push(event);
// DEBUG("EVENT", event);
} else {
ALERT("Queue full at", this.queue.length);
if (!this.prev) {
DEBUG("Initialising `prev`");
this.prev = meta; // the fields compared below live on meta, not on the raw notification
return;
}
this.processQueue();
try {
DEBUG("Running");
const cur = meta; // compare the extracted metadata (_sequence, lineName, time)
const sequence = Number(cur._sequence);
if (this.prev.lineName == cur.lineName && this.prev._sequence == cur._sequence &&
this.prev.lineStatus == "online" && cur.lineStatus == "online" && sequence) {
if (this.prev.time.substr(0, 10) != cur.time.substr(0, 10)) {
// Possibly a date change, but could also be a missing timestamp
// or something else.
const ts0 = new Date(this.prev.time)
const ts1 = new Date(cur.time);
if (!isNaN(ts0) && !isNaN(ts1) && ts0.getUTCDay() != ts1.getUTCDay()) {
INFO("Sequence shot across midnight UTC detected", cur._sequence, cur.lineName);
const ldsp = {
sequence: this.prev._sequence,
point: this.prev._point,
remarks: "Last shotpoint of the day",
labels: ["LDSP", "Prod"],
meta: {auto: true, author: `*${this.constructor.name}*`}
};
const fdsp = {
sequence: cur._sequence,
point: cur._point,
remarks: "First shotpoint of the day",
labels: ["FDSP", "Prod"],
meta: {auto: true, author: `*${this.constructor.name}*`}
};
INFO("LDSP", ldsp);
INFO("FDSP", fdsp);
const projectId = await schema2pid(this.prev._schema);
if (projectId) {
await event.post(projectId, ldsp);
await event.post(projectId, fdsp);
} else {
ERROR("projectId not found for", this.prev._schema);
}
} else {
WARNING("False positive on these timestamps", this.prev.time, cur.time);
WARNING("No events were created");
}
}
}
} catch (err) {
DEBUG(`${this.author} error`, err);
throw err;
} finally {
this.prev = meta;
}
}
}
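The date-change test is two-staged: a cheap comparison of the ISO date prefixes, confirmed by parsing both timestamps and comparing UTC weekdays, which is sufficient for consecutive shots since they can never be a whole week apart. Illustrative values:

// Illustration only; timestamps are invented.
const prev = "2023-10-24T23:59:58.120Z";
const cur  = "2023-10-25T00:00:04.370Z";
prev.substr(0, 10) != cur.substr(0, 10);                 // true: date prefix differs
new Date(prev).getUTCDay() != new Date(cur).getUTCDay(); // true: Tue (2) vs Wed (3)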

View File

@@ -0,0 +1,60 @@
const project = require('../../lib/db/project');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
class DetectProjectConfigurationChange {
author = `*${this.constructor.name}*`;
constructor (ctx) {
DEBUG(`${this.author} instantiated`);
// Grab project configurations.
// NOTE that this will run asynchronously
this.run({channel: "project"}, ctx);
}
async run (data, ctx) {
if (!data || data.channel !== "project") {
return;
}
// Project notifications, as of this writing, most likely
// do not carry payloads, as those would exceed the notification
// size limit.
// For our purposes, we do not care as we just re-read all
// the configurations for all non-archived projects.
try {
DEBUG("Project configuration change detected")
const projects = await project.get();
const _ctx_data = {};
for (let pid of projects.map(i => i.pid)) {
DEBUG("Retrieving configuration for", pid);
const cfg = await project.configuration.get(pid);
if (cfg?.archived === true) {
DEBUG(pid, "is archived. Ignoring");
continue;
}
DEBUG("Saving configuration for", pid);
_ctx_data[pid] = cfg;
}
if (! ("projects" in ctx)) {
ctx.projects = {};
}
ctx.projects.configuration = _ctx_data;
DEBUG("Committed project configuration to ctx.projects.configuration");
} catch (err) {
DEBUG(`${this.author} error`, err);
throw err;
}
}
}
module.exports = DetectProjectConfigurationChange;
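Since the configurations are committed to the shared ctx object, sibling handlers can read them without another database round trip. A sketch of such a consumer; the function name and pid are invented:

// Hypothetical consumer running against the same ctx; the pid is invented.
function labelsFor (ctx, pid) {
  return ctx.projects?.configuration?.[pid]?.labels ?? {};
}
labelsFor(ctx, "demo-survey");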

Some files were not shown because too many files have changed in this diff.