Compare commits

...

101 Commits

Author SHA1 Message Date
D. Berge
6eccbf215a There should be no need to await.
That is because the queue handler will, in theory, only ever
process one event at a time.
2023-09-30 21:29:15 +02:00
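A minimal sketch of that serialisation guarantee, with hypothetical names (not code from this diff):

    // Hypothetical serialised queue, sketching the reasoning above:
    // events are chained so that at most one handler runs at a time,
    // which is why callers need not await enqueue().
    class EventQueue {
      constructor (handler) {
        this.handler = handler;
        this.chain = Promise.resolve();
      }
      enqueue (event) {
        // Each event runs only after the previous one has settled;
        // errors are logged so a failure cannot wedge the chain.
        this.chain = this.chain
          .then(() => this.handler(event))
          .catch(console.error);
      }
    }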
D. Berge
8abc05f04e Remove dead code 2023-09-30 21:29:15 +02:00
D. Berge
8f587467f9 Add comment 2023-09-30 21:29:15 +02:00
D. Berge
3d7a91c7ff Rewrite ReportLineChangeTime 2023-09-30 21:29:15 +02:00
D. Berge
3fd408074c Support passing array in opts.sequences to event.list() 2023-09-30 21:29:15 +02:00
D. Berge
f71cbd8f51 Add unique utility function 2023-09-30 21:29:15 +02:00
D. Berge
915df8ac16 Add handler for creation of line change time events 2023-09-30 21:29:15 +02:00
D. Berge
d5ecb08a2d Allow switching to event entry by time.
A ‘Timed’ button is shown when a new (not edited) event is in
the event entry dialogue and the event has sequence and/or
point values. Pressing the button deletes the sequence/point
information and sets the date and time fields to current time.

Fixes #277.
2023-09-30 21:26:32 +02:00
D. Berge
9388cd4861 Make daily_tasks work with new project configuration 2023-09-30 20:36:46 +02:00
D. Berge
180590b411 Mark events as being automatically generated 2023-09-30 01:42:27 +02:00
D. Berge
4ec37539bf Add utils to work with Postgres ranges 2023-09-30 01:41:45 +02:00
D. Berge
8755fe01b6 Refactor events.list.
The SQL has been simplified and the following changes made:

- The `sequence` argument can now take only a single
  sequence, not a list of sequences.
- A new `sequences` argument is recognised. It takes a list
  of sequences (as a string).
- A new `label` argument is recognised. It takes a label
  name and returns events containing that label.
- A new `jpq` argument is recognised. It takes a JSONPath
  string which is applied to `meta` with jsonb_path_exists(),
  returning any events for which the JSON path expression
  matches.
2023-09-30 01:37:22 +02:00
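A hedged sketch of the refactored call shapes; only the argument names and the jsonb_path_exists() behaviour come from the message above, the rest is assumed:

    // Assumed call shapes for events.list(); argument names as above.
    await events.list({ sequence: 12 });          // one individual sequence
    await events.list({ sequences: "12,13,14" }); // list of sequences, as a string
    await events.list({ label: "FSP" });          // events carrying the given label
    // `jpq` is applied to `meta` with jsonb_path_exists(), i.e. roughly:
    //   WHERE jsonb_path_exists(meta, '$.soft_start ? (@ == true)'::jsonpath)
    await events.list({ jpq: '$.soft_start ? (@ == true)' });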
D. Berge
0bfe54e0c2 Include the meta attribute when posting events 2023-09-30 01:36:18 +02:00
D. Berge
29bc689b84 Merge branch '276-add-soft-start-event-detection' into 'devel'
Resolve "Add soft start event detection"

Closes #276

See merge request wgp/dougal/software!44
2023-09-29 15:02:57 +00:00
D. Berge
65682febc7 Add soft start and full volume events detection 2023-09-29 17:02:03 +02:00
D. Berge
d408665d62 Write meta info to automatic events 2023-09-29 16:49:27 +02:00
D. Berge
64fceb0a01 Merge branch '127-sol-eol-events-not-being-inserted-in-the-log-automatically' into 'devel'
Resolve "SOL / EOL events not being inserted in the log automatically"

Closes #127

See merge request wgp/dougal/software!43
2023-09-29 14:17:46 +00:00
D. Berge
ab58e578c9 Use DEBUG library throughout 2023-09-29 16:16:33 +02:00
D. Berge
0e58b8fa5b Refactor code to identify candidate schemas.
As part of the refactoring, we took into account a slight payload
format change (project configuration details are under the `data`
attribute).
2023-09-29 16:13:35 +02:00
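The payload change, sketched with an assumed notification shape:

    // Project configuration details now sit under `data` (assumed shape).
    const payload = JSON.parse(notification);
    const config = payload.data; // previously: const config = payload;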
D. Berge
99ac082f00 Use common naming convention both online and offline 2023-09-29 16:11:44 +02:00
D. Berge
4d3fddc051 Merge branch '274-use-new-db-event-notifier-for-event-processing-handlers' into 'devel'
Resolve "Use new DB event notifier for event processing handlers"

Closes #275, #230, and #274

See merge request wgp/dougal/software!42
2023-09-29 14:03:00 +00:00
D. Berge
42456439a9 Remove ad-hoc notifier 2023-09-29 15:59:12 +02:00
D. Berge
ee0c0e7308 Replace ad-hoc notifier with pg-listen based version 2023-09-29 15:59:12 +02:00
D. Berge
998c272bf8 Add var/* to .gitignore 2023-09-29 15:59:12 +02:00
D. Berge
daddd1f0e8 Add script to rewrite packet captures IP and MAC addresses.
Closes #230.
2023-09-29 15:58:59 +02:00
D. Berge
17f20535cb Cope with fragmented UDP packets.
Fixes #275.

Use this as the systemd unit file to run as a service:

[Unit]
Description=Dougal Network Packet Capture
After=network.target remote-fs.target nss-lookup.target

[Service]
ExecStart=/srv/dougal/software/sbin/packet-capture.sh
ExecStop=/bin/kill -s QUIT $MAINPID
Restart=always
User=root
Group=users
Environment=PATH=/usr/bin:/usr/sbin:/usr/local/bin
Environment=INS_HOST=172.31.10.254
WorkingDirectory=/srv/dougal/software/var/
SyslogIdentifier=dougal.pcap

[Install]
WantedBy=multi-user.target
2023-09-29 15:28:11 +02:00
D. Berge
0829ea3ea1 Save a copy of the headers not the original.
Otherwise ExpressJS will complain about trying to modify
headers that have already been sent.
2023-09-24 12:17:16 +02:00
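A minimal sketch of the fix's logic, assuming a Node/Express response object; inspectHeaders() is hypothetical:

    // Keep a detached copy: mutating the live headers object after the
    // response is sent makes Express throw "Cannot set headers after
    // they are sent to the client".
    const savedHeaders = { ...res.getHeaders() };
    res.end();
    inspectHeaders(savedHeaders); // safe: the response object is untouched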
D. Berge
2069d9c3d7 Remove dead code 2023-09-24 12:15:06 +02:00
D. Berge
8a2d526c50 Ignore schema attribute in PATCH payload.
Fixes #273.
2023-09-24 12:14:20 +02:00
D. Berge
8ad96d6f73 Ensure that requiredFields is always defined.
Otherwise, `Object.entries(requiredFields)` may fail.
2023-09-24 11:59:26 +02:00
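The guard amounts to defaulting to an empty object; a sketch (schema shape assumed):

    // Object.entries(undefined) throws a TypeError, so fall back to {}.
    const requiredFields = schema.requiredFields ?? {};
    for (const [field, spec] of Object.entries(requiredFields)) {
      // validate `field` against `spec`…
    }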
D. Berge
947faf8c05 Provide default glob specification for map layer imports 2023-09-24 11:34:10 +02:00
D. Berge
a948556455 Fail gracefully if map layer data does not exist.
Fixes #272.
2023-09-24 11:33:32 +02:00
D. Berge
835384b730 Apply path conversion to QC definition files 2023-09-23 22:50:09 +02:00
D. Berge
c5b93794f4 Move path conversion to general utilities 2023-09-23 13:44:53 +02:00
D. Berge
056cd32f0e Merge branch '271-qc-results-not-being-refreshed' into 'devel'
Resolve "QC results not being refreshed"

Closes #271

See merge request wgp/dougal/software!41
2023-09-18 10:08:35 +00:00
D. Berge
49bb413110 Merge branch '270-real-time-interface-stopped-working' into 'devel'
Resolve "Real-time interface stopped working"

Closes #270

See merge request wgp/dougal/software!40
2023-09-18 10:08:27 +00:00
D. Berge
ceccc42050 Don't cache response ETags for QC endpoints 2023-09-18 12:06:38 +02:00
D. Berge
aa3379e1c6 Adapt RTI save function to refactored project configuration in DB 2023-09-18 11:58:55 +02:00
D. Berge
4063af0e25 Merge branch '268-inline-crossline-errors-no-longer-being-calculated' into 'devel'
Resolve "Inline/crossline errors no longer being calculated"

Closes #268

See merge request wgp/dougal/software!39
2023-09-15 18:03:51 +00:00
D. Berge
d53e6060a4 Update database templates to v0.4.2 2023-09-15 20:01:54 +02:00
D. Berge
85d8fc8cc0 Update required database version 2023-09-15 14:22:22 +02:00
D. Berge
0fe40b1839 Add missing require 2023-09-15 14:22:02 +02:00
D. Berge
21de4b757f Add database upgrade file 29. 2023-09-15 12:52:42 +02:00
D. Berge
96cdbb2cff Add database upgrade file 28. 2023-09-15 12:52:27 +02:00
D. Berge
d531643b58 Add database upgrade file 27. 2023-09-15 12:52:06 +02:00
D. Berge
a1779ef488 Do not cache /navdata endpoint responses 2023-09-14 13:20:16 +02:00
D. Berge
5239dece1e Do not cache GIS endpoint responses 2023-09-14 13:19:57 +02:00
D. Berge
a7d7837816 Allow only admins to patch project configurations 2023-09-14 13:19:16 +02:00
D. Berge
ebcfc7df47 Allow everyone to access project configuration.
This is necessary as it is requested by various parts of the
frontend.

Consider more granular access control.
2023-09-14 13:17:28 +02:00
D. Berge
dc4b9002fe Adapt QC endpoints to new configuration APIs 2023-09-14 13:15:59 +02:00
D. Berge
33618b6b82 Do not cache Set-Cookie headers 2023-09-14 13:13:47 +02:00
D. Berge
597d407acc Adapt QC view to new label payload from API 2023-09-14 13:13:18 +02:00
D. Berge
6162a5bdee Stop importing P1/90s until scripts are upgraded.
See #266.
2023-09-14 13:09:38 +02:00
D. Berge
696bbf7a17 Take etc/config.yaml out of revision control.
This file contains site-specific configuration. Instead, an
example config.yaml is now provided.
2023-09-14 13:07:33 +02:00
D. Berge
821fcf0922 Add wx forecast info to plan (experiment).
Use https://open-meteo.com/ as a weather forecast provider.

This code is intended for demonstration only, not for
production purposes.

(issue #157)


(cherry picked from commit cc4bce1356)
2023-09-13 20:04:15 +00:00
D. Berge
b1712d838f Merge branch '245-export-event-log-as-csv' into 'devel'
Resolve "Export event log as CSV"

Closes #245

See merge request wgp/dougal/software!38
2023-09-13 20:02:07 +00:00
D. Berge
895b865505 Expose CSV output option in user interface 2023-09-13 21:59:57 +02:00
D. Berge
5a2af5c49e Add CSV output option for events log 2023-09-13 21:58:06 +02:00
D. Berge
24658f4017 Allow patching project name if no name is already set 2023-09-13 16:13:43 +02:00
D. Berge
6707cda75e Ignore case when patching configuration ID 2023-09-13 16:13:12 +02:00
D. Berge
1302a31b3d Improve formatting of layer alert 2023-09-13 13:00:19 +02:00
D. Berge
871a1e8f3a Don't show alert if layer is empty (but log to console) 2023-09-13 12:59:47 +02:00
D. Berge
04e1144bab Simplify expression 2023-09-13 12:59:24 +02:00
D. Berge
6312d94f3e Add support for user layer tooltips and popups 2023-09-13 12:58:44 +02:00
D. Berge
ed91026319 Add tooltip and popup options to map layer configuration.
- `tooltip` takes the name of a GeoJSON property that will be
  shown in a tooltip when hovering the mouse over a feature.

- `popup` can take either the name of a property as above, or
  the boolean value `true`. In the latter case, a table of all
  the feature's properties will be shown when clicking on the
  feature. In the former case, only the value of the designated
  property will be shown.
2023-09-13 12:55:37 +02:00
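A sketch of a layer entry using both options; the layer and property names are invented:

    // Hypothetical `imports.map.layers` entry with the new options.
    {
      "Obstructions": [
        {
          "format": "geojson",
          "path": "gis/obstructions",
          "globs": ["**/*.geojson"],
          "tooltip": "name", // show the `name` property on hover
          "popup": true      // show a table of all properties on click
        }
      ]
    }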
D. Berge
441a4e296d Import map layers from the runner 2023-09-13 11:24:04 +02:00
D. Berge
c33c3f61df Alert the user if a map layer is too big 2023-09-13 11:22:49 +02:00
D. Berge
2cc293b724 Do not fail trying to restore state for non-existing layers 2023-09-13 11:22:05 +02:00
D. Berge
ee129b2faa Merge branch '114-allow-users-to-show-arbitrary-geojson-on-the-map' into 'devel'
Resolve "Allow users to show arbitrary GeoJSON on the map."

Closes #114

See merge request wgp/dougal/software!37
2023-09-12 17:34:51 +00:00
D. Berge
98d9b3b093 Adapt Map view to new label payload from API 2023-09-12 19:31:58 +02:00
D. Berge
57b9b420f8 Show an error if a layer is too large.
The map view limits the size of layers (both user and regular) in
order to keep the system responsive, as Leaflet is not great at
handling large layers.
2023-09-12 19:29:02 +02:00
D. Berge
9e73f2603a Implement user layers on map view.
The user layers are defined in the project configuration under
`imports.map.layers`.

Multiple layers may be defined and each layer may consist of one
or more GeoJSON files. Files are retrieved via the /files/ API
endpoint.
2023-09-12 19:29:02 +02:00
D. Berge
707889be42 Refactor layer API endpoint and database functions.
- A single get() function is used both to list all available
  layers, if no layer name is given, or a single layer.
- The database no longer holds the actual layer contents,
  only the path to the layer file(s), so the list() function
  is now redundant as we return the full payload in every case.
- The /gis/layer and /gis/layer/:name endpoints now have the same
  payload structure.
2023-09-12 19:29:02 +02:00
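A hedged sketch of the unified accessor; the function and table shapes are assumed, not taken from the diff:

    // One get() serves both the collection and a single layer (assumed).
    async function get (name) {
      if (name === undefined) {
        return db.query("SELECT name, path FROM layers"); // all layers
      }
      return db.query("SELECT name, path FROM layers WHERE name = $1", [name]);
    }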
D. Berge
f9a70e0145 Refactor map layer importer.
- Now a layer may consist of a path pointing to a directory plus a
  glob, or a path pointing directly to a single file.
- If a file already exists in the database, check if the layer
  name has changed and if so, update it.
- Do not import the actual file contents, as the path is enough
  (it can be retrieved via the /file/:path API endpoint).
2023-09-12 11:05:10 +02:00
D. Berge
b71489cee1 Add get_file_data() function to datastore 2023-09-12 11:04:37 +02:00
D. Berge
0a9bde5f10 Add Background layer to map.
This is a limited implementation of layer backgrounds. The API
supports an arbitrary number of arbitrarily named background
layers, but for the time being we only recognise one background
layer named `Background` and of GeoJSON type.

Certain properties, such as colour/color, opacity, etc., are
recognised and applied as feature styles; otherwise a default
style is used.
2023-09-11 10:17:10 +02:00
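A sketch of mapping the recognised properties onto Leaflet path styles with a fallback default; only colour/color and opacity come from the message, the default values are invented:

    // Assumed style mapping for the Background GeoJSON layer.
    function featureStyle (feature) {
      const p = feature.properties ?? {};
      return {
        color: p.color ?? p.colour ?? "#3388ff", // Leaflet's default blue
        opacity: p.opacity ?? 1.0
      };
    }
    L.geoJSON(data, { style: featureStyle }).addTo(map);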
D. Berge
36d5862375 Add map layer middleware and API endpoints 2023-09-11 10:15:19 +02:00
D. Berge
398c702004 Add map layer functions to database interface 2023-09-11 10:12:46 +02:00
D. Berge
b2d1798338 Add map layer importer 2023-09-11 10:00:59 +02:00
D. Berge
4f165b0c83 Revert behaviour of new jwt-express version.
Fixes breakage introduced in commit
cd00f8b995.
2023-09-10 14:09:01 +02:00
D. Berge
2c86944a51 Merge branch '262-preset-remarks-and-labels-no-longer-working-with-api-0-4-0' into 'devel'
Resolve "Preset remarks and labels no longer working with API 0.4.0"

Closes #262

See merge request wgp/dougal/software!36
2023-09-10 10:10:22 +00:00
D. Berge
5fc51de7d8 Adapt Log view to new configuration endpoint in the API 2023-09-10 12:01:59 +02:00
D. Berge
158e0fb788 Adapt Log view to new label payload from API 2023-09-10 12:01:30 +02:00
D. Berge
941d15c1bc Return labels directly from project configuration.
NOTE: This is a breaking API change. Previously this returned an
array of labels; now it returns an object.
2023-09-10 11:59:38 +02:00
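The shape change, sketched (label name and data invented; keying as used by the frontend):

    // Before: an array of label records…
    [ { name: "FSP", data: { colour: "green" } } ]
    // …after: an object keyed by label name.
    { "FSP": { colour: "green" } }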
D. Berge
cd00f8b995 Breaking-change Node package updates (server) 2023-09-10 11:49:56 +02:00
D. Berge
44515f8e78 Non-breaking Node package updates (server) 2023-09-09 20:54:04 +02:00
D. Berge
54fbc76da5 Merge branch '261-wrong-missing-shots-value-in-sequence-summary' into 'devel'
Resolve "Wrong missing shots value in sequence summary"

Closes #261

See merge request wgp/dougal/software!35
2023-09-09 18:46:33 +00:00
D. Berge
c1b5196134 Update database templates to v0.3.12.
Incorporates fix for bug #261.
2023-09-09 20:45:11 +02:00
D. Berge
fb3d3be546 Trailing slash in API call results in "unauthorised" error.
No idea why.
2023-09-09 20:39:49 +02:00
D. Berge
8e11e242ed Remove NODE_OPTIONS from scripts.
Node version 18 does not seem to like it.
2023-09-09 20:37:08 +02:00
D. Berge
8a815ce3ef Add database upgrade file 26. 2023-09-09 20:23:20 +02:00
D. Berge
91076a50ad Show API error messages if available 2023-09-09 17:00:32 +02:00
D. Berge
e624dcdde0 Support async API callbacks in Vuex action 2023-09-09 16:59:43 +02:00
D. Berge
a25676122c Update material design icons dependency 2023-09-09 16:58:44 +02:00
D. Berge
e4dfbe2c9a Update minimum node version to 18 2023-09-09 16:57:20 +02:00
D. Berge
78fb34d049 Update the API version number 2023-09-09 16:56:52 +02:00
D. Berge
38c4125f4f Support patching values out of the configuration.
A configuration patch having keys with null values will result
in those keys being removed from the configuration.
2023-09-09 16:53:42 +02:00
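This follows JSON-merge-patch-like semantics; a sketch of applying such a patch (helper name hypothetical):

    // null deletes a key; nested objects merge recursively; other values replace.
    function applyPatch (config, patch) {
      for (const [key, value] of Object.entries(patch)) {
        if (value === null) {
          delete config[key];
        } else if (typeof value === "object" && !Array.isArray(value)) {
          config[key] = applyPatch(config[key] ?? {}, value);
        } else {
          config[key] = value;
        }
      }
      return config;
    }
    // applyPatch({ a: 1, b: 2 }, { b: null }) → { a: 1 }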
D. Berge
04d6cbafe3 Use refactored database API in QC executable 2023-09-09 16:42:30 +02:00
D. Berge
e6319172d8 Fix typo in QC executable 2023-09-09 16:42:00 +02:00
D. Berge
5230ff63e3 Use new database API calls for configuration 2023-09-09 16:39:53 +02:00
D. Berge
2b364bbff7 Make bin script compatible with Python 3.6 2023-09-09 16:38:51 +02:00
61 changed files with 2618 additions and 564 deletions

.gitignore

@@ -11,3 +11,5 @@ lib/www/client/dist/
etc/surveys/*.yaml
!etc/surveys/_*.yaml
etc/ssl/*
etc/config.yaml
var/*


@@ -12,6 +12,18 @@ surveys should be under $HOME/etc/surveys/*.yaml. In both cases,
$HOME is the home directory of the user running this script.
"""
def is_relative_to(it, other):
"""
is_relative_to() is not present before Python 3.9, so we
need this kludge to get Dougal to run on OpenSUSE 15.4
"""
if "is_relative_to" in dir(it):
return it.is_relative_to(other)
return str(it.absolute()).startswith(str(other.absolute()))
prefix = os.environ.get("DOUGAL_ROOT", os.environ.get("HOME", ".")+"/software")
DOUGAL_ROOT = os.environ.get("DOUGAL_ROOT", os.environ.get("HOME", ".")+"/software")
@@ -142,7 +154,7 @@ def untranslate_path (file):
if filepath.is_absolute():
if type(import_paths) == str:
if filepath.is_relative_to(import_paths):
if is_relative_to(filepath, import_paths):
physical_root = pathlib.Path("/")
physical_prefix = pathlib.Path(import_paths)
return str(root.joinpath(filepath.relative_to(physical_prefix)))
@@ -152,7 +164,7 @@ def untranslate_path (file):
for key, value in import_paths.items():
value = dougal_root.joinpath(value)
physical_prefix = pathlib.Path(value)
if filepath.is_relative_to(physical_prefix):
if is_relative_to(filepath, physical_prefix):
logical_prefix = physical_root.joinpath(pathlib.Path(key)).resolve()
return str(logical_prefix.joinpath(filepath.relative_to(physical_prefix)))


@@ -11,11 +11,9 @@ from datastore import Datastore
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
print("Connecting to database")
db = Datastore()
surveys = db.surveys()
print("Reading surveys")
for survey in surveys:


@@ -589,7 +589,33 @@ class Datastore:
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def get_file_data(self, path, cursor = None):
"""
Retrieve arbitrary data associated with a file.
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
realpath = configuration.translate_path(path)
hash = file_hash(realpath)
qry = """
SELECT data
FROM file_data
WHERE hash = %s;
"""
cur.execute(qry, (hash,))
res = cur.fetchone()
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
return res[0]
def surveys (self, include_archived = False):
"""


@@ -9,11 +9,9 @@ from datastore import Datastore
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
print("Connecting to database")
db = Datastore()
surveys = db.surveys()
print("Reading surveys")
for survey in surveys:

bin/import_map_layers.py

@@ -0,0 +1,127 @@
#!/usr/bin/python3
"""
Import user map layers.
For each survey, check for new or modified map layer
files and (re-)import their metadata into the database.
"""
import os
import sys
import pathlib
import re
import time
import json
import configuration
from datastore import Datastore
if __name__ == '__main__':
"""
Imports map layers from the directories defined in the configuration object
`imports.map.layers`. The content of that key is an object with the following
structure:
{
  layer1Name: [
    {
      format: "geojson",
      path: "", // Logical path to a directory or a single file
      globs: [
        "**/*.geojson", // List of globs matching map data files
      ]
    },
  ],
  layer2Name: …
}
"""
def process (layer_name, layer, physical_filepath):
physical_filepath = str(physical_filepath)
logical_filepath = configuration.untranslate_path(physical_filepath)
print(f"Found {logical_filepath}")
if not db.file_in_db(logical_filepath):
age = time.time() - os.path.getmtime(physical_filepath)
if age < file_min_age:
print("Skipping file because too new", logical_filepath)
return
print("Importing")
file_info = {
"type": "map_layer",
"format": layer["format"],
"name": layer_name,
"tooltip": layer.get("tooltip"),
"popup": layer.get("popup")
}
db.save_file_data(logical_filepath, json.dumps(file_info))
else:
file_info = db.get_file_data(logical_filepath)
dirty = False
if file_info:
if file_info["name"] != layer_name:
print("Renaming to", layer_name)
file_info["name"] = layer_name
dirty = True
if file_info.get("tooltip") != layer.get("tooltip"):
print("Changing tooltip to", layer.get("tooltip") or "null")
file_info["tooltip"] = layer.get("tooltip")
dirty = True
if file_info.get("popup") != layer.get("popup"):
print("Changing popup to", layer.get("popup") or "null")
file_info["popup"] = layer.get("popup")
dirty = True
if dirty:
db.save_file_data(logical_filepath, json.dumps(file_info))
else:
print("Already in DB")
print("Reading configuration")
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Connecting to database")
db = Datastore()
surveys = db.surveys()
print("Reading surveys")
for survey in surveys:
print(f'Survey: {survey["id"]} ({survey["schema"]})')
db.set_survey(survey["schema"])
try:
map_layers = survey["imports"]["map"]["layers"]
except KeyError:
print("No map layers defined")
continue
for layer_name, layer_items in map_layers.items():
for layer in layer_items:
fileprefix = layer["path"]
realprefix = configuration.translate_path(fileprefix)
if os.path.isfile(realprefix):
process(layer_name, layer, realprefix)
elif os.path.isdir(realprefix):
if not "globs" in layer:
layer["globs"] = [ "**/*.geojson" ]
for globspec in layer["globs"]:
for physical_filepath in pathlib.Path(realprefix).glob(globspec):
process(layer_name, layer, physical_filepath)
print("Done")


@@ -132,18 +132,21 @@ run $BINDIR/import_preplots.py
print_log "Import raw P1/11"
run $BINDIR/import_raw_p111.py
print_log "Import raw P1/90"
run $BINDIR/import_raw_p190.py
#print_log "Import raw P1/90"
#run $BINDIR/import_raw_p190.py
print_log "Import final P1/11"
run $BINDIR/import_final_p111.py
print_log "Import final P1/90"
run $BINDIR/import_final_p190.py
#print_log "Import final P1/90"
#run $BINDIR/import_final_p190.py
print_log "Import SmartSource data"
run $BINDIR/import_smsrc.py
print_log "Import map user layers"
run $BINDIR/import_map_layers.py
# if [[ -z "$RUNNER_NOEXPORT" ]]; then
# print_log "Export system data"
# run $BINDIR/system_exports.py


@@ -1,5 +1,5 @@
\connect dougal
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.12"}')
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.2"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.12"}' WHERE public.info.key = 'version';
SET value = public.info.value || '{"db_schema": "0.4.2"}' WHERE public.info.key = 'version';


@@ -2,8 +2,8 @@
-- PostgreSQL database dump
--
-- Dumped from database version 14.2
-- Dumped by pg_dump version 14.2
-- Dumped from database version 14.8
-- Dumped by pg_dump version 14.9
SET statement_timeout = 0;
SET lock_timeout = 0;
@@ -70,173 +70,171 @@ If the path matches that of an existing entry, delete that entry (which cascades
CREATE PROCEDURE _SURVEY__TEMPLATE_.adjust_planner()
    LANGUAGE plpgsql
    AS $$
DECLARE
  _planner_config jsonb;
  _planned_line planned_lines%ROWTYPE;
  _lag interval;
  _last_sequence sequences_summary%ROWTYPE;
  _deltatime interval;
  _shotinterval interval;
  _tstamp timestamptz;
  _incr integer;
BEGIN
  SET CONSTRAINTS planned_lines_pkey DEFERRED;
  SELECT data->'planner'
  INTO _planner_config
  FROM file_data
  WHERE data ? 'planner';
  SELECT project_configuration()->'planner'
  INTO _planner_config;
  SELECT *
  INTO _last_sequence
  FROM sequences_summary
  ORDER BY sequence DESC
  LIMIT 1;
  SELECT *
  INTO _planned_line
  FROM planned_lines
  WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
  SELECT
    COALESCE(
      ((lead(ts0) OVER (ORDER BY sequence)) - ts1),
      make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer)
    )
  INTO _lag
  FROM planned_lines
  WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
  _incr = sign(_last_sequence.lsp - _last_sequence.fsp);
  RAISE NOTICE '_planner_config: %', _planner_config;
  RAISE NOTICE '_last_sequence: %', _last_sequence;
  RAISE NOTICE '_planned_line: %', _planned_line;
  RAISE NOTICE '_incr: %', _incr;
  -- Does the latest sequence match a planned sequence?
  IF _planned_line IS NULL THEN -- No it doesn't
    RAISE NOTICE 'Latest sequence shot does not match a planned sequence';
    SELECT * INTO _planned_line FROM planned_lines ORDER BY sequence ASC LIMIT 1;
    RAISE NOTICE '_planned_line: %', _planned_line;
    IF _planned_line.sequence <= _last_sequence.sequence THEN
      RAISE NOTICE 'Renumbering the planned sequences starting from %', _planned_line.sequence + 1;
      -- Renumber the planned sequences starting from last shot sequence number + 1
      UPDATE planned_lines
      SET sequence = sequence + _last_sequence.sequence - _planned_line.sequence + 1;
    END IF;
    -- The correction to make to the first planned line's ts0 will be based on either the last
    -- sequence's EOL + default line change time or the current time, whichever is later.
    _deltatime := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1) + make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer), current_timestamp) - _planned_line.ts0;
    -- Is the first of the planned lines start time in the past? (±5 mins)
    IF _planned_line.ts0 < (current_timestamp - make_interval(mins => 5)) THEN
      RAISE NOTICE 'First planned line is in the past. Adjusting times by %', _deltatime;
      -- Adjust the start / end time of the planned lines by assuming that we are at
      -- `defaultLineChangeDuration` minutes away from SOL of the first planned line.
      UPDATE planned_lines
      SET
        ts0 = ts0 + _deltatime,
        ts1 = ts1 + _deltatime;
    END IF;
  ELSE -- Yes it does
    RAISE NOTICE 'Latest sequence does match a planned sequence: %, %', _planned_line.sequence, _planned_line.line;
    -- Is it online?
    IF EXISTS(SELECT 1 FROM raw_lines_files WHERE sequence = _last_sequence.sequence AND hash = '*online*') THEN
      -- Yes it is
      RAISE NOTICE 'Sequence % is online', _last_sequence.sequence;
      -- Let us get the SOL from the events log if we can
      RAISE NOTICE 'Trying to set fsp, ts0 from events log FSP, FGSP';
      WITH e AS (
        SELECT * FROM event_log
        WHERE
          sequence = _last_sequence.sequence
          AND ('FSP' = ANY(labels) OR 'FGSP' = ANY(labels))
        ORDER BY tstamp LIMIT 1
      )
      UPDATE planned_lines
      SET
        fsp = COALESCE(e.point, fsp),
        ts0 = COALESCE(e.tstamp, ts0)
      FROM e
      WHERE planned_lines.sequence = _last_sequence.sequence;
      -- Shot interval
      _shotinterval := (_last_sequence.ts1 - _last_sequence.ts0) / abs(_last_sequence.lsp - _last_sequence.fsp);
      RAISE NOTICE 'Estimating EOL from current shot interval: %', _shotinterval;
      SELECT (abs(lsp-fsp) * _shotinterval + ts0) - ts1
      INTO _deltatime
      FROM planned_lines
      WHERE sequence = _last_sequence.sequence;
      ---- Set ts1 for the current sequence
      --UPDATE planned_lines
      --SET
      --  ts1 = (abs(lsp-fsp) * _shotinterval) + ts0
      --WHERE sequence = _last_sequence.sequence;
      RAISE NOTICE 'Adjustment is %', _deltatime;
      IF abs(EXTRACT(EPOCH FROM _deltatime)) < 8 THEN
        RAISE NOTICE 'Adjustment too small (< 8 s), so not applying it';
        RETURN;
      END IF;
      -- Adjust ts1 for the current sequence
      UPDATE planned_lines
      SET ts1 = ts1 + _deltatime
      WHERE sequence = _last_sequence.sequence;
      -- Now shift all sequences after
      UPDATE planned_lines
      SET ts0 = ts0 + _deltatime, ts1 = ts1 + _deltatime
      WHERE sequence > _last_sequence.sequence;
      RAISE NOTICE 'Deleting planned sequences before %', _planned_line.sequence;
      -- Remove all previous planner entries.
      DELETE
      FROM planned_lines
      WHERE sequence < _last_sequence.sequence;
    ELSE
      -- No it isn't
      RAISE NOTICE 'Sequence % is offline', _last_sequence.sequence;
      -- We were supposed to finish at _planned_line.ts1 but we finished at:
      _tstamp := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1), current_timestamp);
      -- WARNING Next line is for testing only
      --_tstamp := COALESCE(_last_sequence.ts1_final, _last_sequence.ts1);
      -- So we need to adjust timestamps by:
      _deltatime := _tstamp - _planned_line.ts1;
      RAISE NOTICE 'Planned end: %, actual end: % (%, %)', _planned_line.ts1, _tstamp, _planned_line.sequence, _last_sequence.sequence;
      RAISE NOTICE 'Shifting times by % for sequences > %', _deltatime, _planned_line.sequence;
      -- NOTE: This won't work if sequences are not, err… sequential.
      -- NOTE: This has been known to happen in 2020.
      UPDATE planned_lines
      SET
        ts0 = ts0 + _deltatime,
        ts1 = ts1 + _deltatime
      WHERE sequence > _planned_line.sequence;
      RAISE NOTICE 'Deleting planned sequences up to %', _planned_line.sequence;
      -- Remove all previous planner entries.
      DELETE
      FROM planned_lines
      WHERE sequence <= _last_sequence.sequence;
    END IF;
  END IF;
END;
$$;
@@ -367,8 +365,8 @@ COMMENT ON PROCEDURE _SURVEY__TEMPLATE_.augment_event_data(IN maxspan numeric) I
CREATE FUNCTION _SURVEY__TEMPLATE_.binning_parameters() RETURNS jsonb
LANGUAGE sql STABLE LEAKPROOF PARALLEL SAFE
AS $$
SELECT data->'binning' binning FROM file_data WHERE data->>'binning' IS NOT NULL LIMIT 1;
$$;
SELECT project_configuration()->'binning' binning;
$$;
ALTER FUNCTION _SURVEY__TEMPLATE_.binning_parameters() OWNER TO postgres;
@@ -1042,6 +1040,39 @@ ALTER PROCEDURE _SURVEY__TEMPLATE_.log_midnight_shots(IN dt0 date, IN dt1 date)
COMMENT ON PROCEDURE _SURVEY__TEMPLATE_.log_midnight_shots(IN dt0 date, IN dt1 date) IS 'Add midnight shots between two dates dt0 and dt1 to the event_log, unless the events already exist.';
--
-- Name: project_configuration(); Type: FUNCTION; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
CREATE FUNCTION _SURVEY__TEMPLATE_.project_configuration() RETURNS jsonb
LANGUAGE plpgsql
AS $$
DECLARE
schema_name text;
configuration jsonb;
BEGIN
SELECT nspname
INTO schema_name
FROM pg_namespace
WHERE oid = (
SELECT pronamespace
FROM pg_proc
WHERE oid = 'project_configuration'::regproc::oid
);
SELECT meta
INTO configuration
FROM public.projects
WHERE schema = schema_name;
RETURN configuration;
END
$$;
ALTER FUNCTION _SURVEY__TEMPLATE_.project_configuration() OWNER TO postgres;
--
-- Name: replace_placeholders(text, timestamp with time zone, integer, integer); Type: FUNCTION; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
@@ -1488,9 +1519,9 @@ CREATE VIEW _SURVEY__TEMPLATE_.final_lines_summary AS
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
(( SELECT count(*) AS count
FROM _SURVEY__TEMPLATE_.preplot_points
WHERE ((preplot_points.line = fl.line) AND (((preplot_points.point >= s.fsp) AND (preplot_points.point <= s.lsp)) OR ((preplot_points.point >= s.lsp) AND (preplot_points.point <= s.fsp))))) - s.num_points) AS missing_shots,
( SELECT count(*) AS count
FROM _SURVEY__TEMPLATE_.missing_sequence_final_points
WHERE missing_sequence_final_points.sequence = s.sequence) AS missing_shots,
s.length,
s.azimuth,
fl.remarks,
@@ -2137,9 +2168,9 @@ CREATE VIEW _SURVEY__TEMPLATE_.raw_lines_summary AS
(s.ts1 - s.ts0) AS duration,
s.num_points,
s.num_preplots,
(( SELECT count(*) AS count
FROM _SURVEY__TEMPLATE_.preplot_points
WHERE ((preplot_points.line = rl.line) AND (((preplot_points.point >= s.fsp) AND (preplot_points.point <= s.lsp)) OR ((preplot_points.point >= s.lsp) AND (preplot_points.point <= s.fsp))))) - s.num_preplots) AS missing_shots,
(SELECT count(*) AS count
FROM _SURVEY__TEMPLATE_.missing_sequence_raw_points
WHERE missing_sequence_raw_points.sequence = s.sequence) AS missing_shots,
s.length,
s.azimuth,
rl.remarks,


@@ -0,0 +1,162 @@
-- Fix wrong number of missing shots in summary views
--
-- New schema version: 0.3.13
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- Fixes a bug in the `final_lines_summary` and `raw_lines_summary` views
-- which results in the number of missing shots being miscounted on jobs
-- using three sources.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE VIEW raw_lines_summary AS
WITH summary AS (
SELECT DISTINCT rs.sequence,
first_value(rs.point) OVER w AS fsp,
last_value(rs.point) OVER w AS lsp,
first_value(rs.tstamp) OVER w AS ts0,
last_value(rs.tstamp) OVER w AS ts1,
count(rs.point) OVER w AS num_points,
count(pp.point) OVER w AS num_preplots,
public.st_distance(first_value(rs.geometry) OVER w, last_value(rs.geometry) OVER w) AS length,
((public.st_azimuth(first_value(rs.geometry) OVER w, last_value(rs.geometry) OVER w) * (180)::double precision) / pi()) AS azimuth
FROM (raw_shots rs
LEFT JOIN preplot_points pp USING (line, point))
WINDOW w AS (PARTITION BY rs.sequence ORDER BY rs.tstamp ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
)
SELECT rl.sequence,
rl.line,
s.fsp,
s.lsp,
s.ts0,
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
s.num_preplots,
(SELECT count(*) AS count
FROM missing_sequence_raw_points
WHERE missing_sequence_raw_points.sequence = s.sequence) AS missing_shots,
s.length,
s.azimuth,
rl.remarks,
rl.ntbp,
rl.meta
FROM (summary s
JOIN raw_lines rl USING (sequence));
CREATE OR REPLACE VIEW final_lines_summary AS
WITH summary AS (
SELECT DISTINCT fs.sequence,
first_value(fs.point) OVER w AS fsp,
last_value(fs.point) OVER w AS lsp,
first_value(fs.tstamp) OVER w AS ts0,
last_value(fs.tstamp) OVER w AS ts1,
count(fs.point) OVER w AS num_points,
public.st_distance(first_value(fs.geometry) OVER w, last_value(fs.geometry) OVER w) AS length,
((public.st_azimuth(first_value(fs.geometry) OVER w, last_value(fs.geometry) OVER w) * (180)::double precision) / pi()) AS azimuth
FROM final_shots fs
WINDOW w AS (PARTITION BY fs.sequence ORDER BY fs.tstamp ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
)
SELECT fl.sequence,
fl.line,
s.fsp,
s.lsp,
s.ts0,
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
( SELECT count(*) AS count
FROM missing_sequence_final_points
WHERE missing_sequence_final_points.sequence = s.sequence) AS missing_shots,
s.length,
s.azimuth,
fl.remarks,
fl.meta
FROM (summary s
JOIN final_lines fl USING (sequence));
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.3.13' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.3.12' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.13"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.13"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,122 @@
-- Add a project_configuration() function for the new project configuration handling
--
-- New schema version: 0.4.0
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This adapts the schema to the change in how project configurations are
-- handled (https://gitlab.com/wgp/dougal/software/-/merge_requests/29)
-- by creating a project_configuration() function which returns the
-- current project's configuration data.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION project_configuration()
RETURNS jsonb
LANGUAGE plpgsql
AS $$
DECLARE
schema_name text;
configuration jsonb;
BEGIN
SELECT nspname
INTO schema_name
FROM pg_namespace
WHERE oid = (
SELECT pronamespace
FROM pg_proc
WHERE oid = 'project_configuration'::regproc::oid
);
SELECT meta
INTO configuration
FROM public.projects
WHERE schema = schema_name;
RETURN configuration;
END
$$;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.0' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.3.12' AND current_db_version != '0.3.13' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.0"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.0"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,264 @@
-- Modify adjust_planner() to use project_configuration()
--
-- New schema version: 0.4.1
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This modifies adjust_planner() to use project_configuration()
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE PROCEDURE adjust_planner()
LANGUAGE plpgsql
AS $$
DECLARE
_planner_config jsonb;
_planned_line planned_lines%ROWTYPE;
_lag interval;
_last_sequence sequences_summary%ROWTYPE;
_deltatime interval;
_shotinterval interval;
_tstamp timestamptz;
_incr integer;
BEGIN
SET CONSTRAINTS planned_lines_pkey DEFERRED;
SELECT project_configuration()->'planner'
INTO _planner_config;
SELECT *
INTO _last_sequence
FROM sequences_summary
ORDER BY sequence DESC
LIMIT 1;
SELECT *
INTO _planned_line
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
SELECT
COALESCE(
((lead(ts0) OVER (ORDER BY sequence)) - ts1),
make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer)
)
INTO _lag
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
_incr = sign(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE '_planner_config: %', _planner_config;
RAISE NOTICE '_last_sequence: %', _last_sequence;
RAISE NOTICE '_planned_line: %', _planned_line;
RAISE NOTICE '_incr: %', _incr;
-- Does the latest sequence match a planned sequence?
IF _planned_line IS NULL THEN -- No it doesn't
RAISE NOTICE 'Latest sequence shot does not match a planned sequence';
SELECT * INTO _planned_line FROM planned_lines ORDER BY sequence ASC LIMIT 1;
RAISE NOTICE '_planned_line: %', _planned_line;
IF _planned_line.sequence <= _last_sequence.sequence THEN
RAISE NOTICE 'Renumbering the planned sequences starting from %', _planned_line.sequence + 1;
-- Renumber the planned sequences starting from last shot sequence number + 1
UPDATE planned_lines
SET sequence = sequence + _last_sequence.sequence - _planned_line.sequence + 1;
END IF;
-- The correction to make to the first planned line's ts0 will be based on either the last
-- sequence's EOL + default line change time or the current time, whichever is later.
_deltatime := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1) + make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer), current_timestamp) - _planned_line.ts0;
-- Is the first of the planned lines start time in the past? (±5 mins)
IF _planned_line.ts0 < (current_timestamp - make_interval(mins => 5)) THEN
RAISE NOTICE 'First planned line is in the past. Adjusting times by %', _deltatime;
-- Adjust the start / end time of the planned lines by assuming that we are at
-- `defaultLineChangeDuration` minutes away from SOL of the first planned line.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime;
END IF;
ELSE -- Yes it does
RAISE NOTICE 'Latest sequence does match a planned sequence: %, %', _planned_line.sequence, _planned_line.line;
-- Is it online?
IF EXISTS(SELECT 1 FROM raw_lines_files WHERE sequence = _last_sequence.sequence AND hash = '*online*') THEN
-- Yes it is
RAISE NOTICE 'Sequence % is online', _last_sequence.sequence;
-- Let us get the SOL from the events log if we can
RAISE NOTICE 'Trying to set fsp, ts0 from events log FSP, FGSP';
WITH e AS (
SELECT * FROM event_log
WHERE
sequence = _last_sequence.sequence
AND ('FSP' = ANY(labels) OR 'FGSP' = ANY(labels))
ORDER BY tstamp LIMIT 1
)
UPDATE planned_lines
SET
fsp = COALESCE(e.point, fsp),
ts0 = COALESCE(e.tstamp, ts0)
FROM e
WHERE planned_lines.sequence = _last_sequence.sequence;
-- Shot interval
_shotinterval := (_last_sequence.ts1 - _last_sequence.ts0) / abs(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE 'Estimating EOL from current shot interval: %', _shotinterval;
SELECT (abs(lsp-fsp) * _shotinterval + ts0) - ts1
INTO _deltatime
FROM planned_lines
WHERE sequence = _last_sequence.sequence;
---- Set ts1 for the current sequence
--UPDATE planned_lines
--SET
--ts1 = (abs(lsp-fsp) * _shotinterval) + ts0
--WHERE sequence = _last_sequence.sequence;
RAISE NOTICE 'Adjustment is %', _deltatime;
IF abs(EXTRACT(EPOCH FROM _deltatime)) < 8 THEN
RAISE NOTICE 'Adjustment too small (< 8 s), so not applying it';
RETURN;
END IF;
-- Adjust ts1 for the current sequence
UPDATE planned_lines
SET ts1 = ts1 + _deltatime
WHERE sequence = _last_sequence.sequence;
-- Now shift all sequences after
UPDATE planned_lines
SET ts0 = ts0 + _deltatime, ts1 = ts1 + _deltatime
WHERE sequence > _last_sequence.sequence;
RAISE NOTICE 'Deleting planned sequences before %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence < _last_sequence.sequence;
ELSE
-- No it isn't
RAISE NOTICE 'Sequence % is offline', _last_sequence.sequence;
-- We were supposed to finish at _planned_line.ts1 but we finished at:
_tstamp := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1), current_timestamp);
-- WARNING Next line is for testing only
--_tstamp := COALESCE(_last_sequence.ts1_final, _last_sequence.ts1);
-- So we need to adjust timestamps by:
_deltatime := _tstamp - _planned_line.ts1;
RAISE NOTICE 'Planned end: %, actual end: % (%, %)', _planned_line.ts1, _tstamp, _planned_line.sequence, _last_sequence.sequence;
RAISE NOTICE 'Shifting times by % for sequences > %', _deltatime, _planned_line.sequence;
-- NOTE: This won't work if sequences are not, err… sequential.
-- NOTE: This has been known to happen in 2020.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime
WHERE sequence > _planned_line.sequence;
RAISE NOTICE 'Deleting planned sequences up to %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence <= _last_sequence.sequence;
END IF;
END IF;
END;
$$;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.1' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.4.0' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.1"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.1"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,98 @@
-- Modify binning_parameters() to use project_configuration()
--
-- New schema version: 0.4.2
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This modifies binning_parameters() to use project_configuration()
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION binning_parameters() RETURNS jsonb
LANGUAGE sql STABLE LEAKPROOF PARALLEL SAFE
AS $$
SELECT project_configuration()->'binning' binning;
$$;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.2' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.4.1' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.2"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.2"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -9,7 +9,7 @@
"version": "0.0.0",
"license": "UNLICENSED",
"dependencies": {
"@mdi/font": "^5.6.55",
"@mdi/font": "^7.2.96",
"core-js": "^3.6.5",
"d3": "^7.0.1",
"jwt-decode": "^3.0.0",
@@ -1763,9 +1763,9 @@
}
},
"node_modules/@mdi/font": {
"version": "5.9.55",
"resolved": "https://registry.npmjs.org/@mdi/font/-/font-5.9.55.tgz",
"integrity": "sha512-jswRF6q3eq8NWpWiqct6q+6Fg/I7nUhrxYJfiEM8JJpap0wVJLQdbKtyS65GdlK7S7Ytnx3TTi/bmw+tBhkGmg=="
"version": "7.2.96",
"resolved": "https://registry.npmjs.org/@mdi/font/-/font-7.2.96.tgz",
"integrity": "sha512-e//lmkmpFUMZKhmCY9zdjRe4zNXfbOIJnn6xveHbaV2kSw5aJ5dLXUxcRt1Gxfi7ZYpFLUWlkG2MGSFAiqAu7w=="
},
"node_modules/@mrmlnc/readdir-enhanced": {
"version": "2.2.1",
@@ -16442,9 +16442,9 @@
}
},
"@mdi/font": {
"version": "5.9.55",
"resolved": "https://registry.npmjs.org/@mdi/font/-/font-5.9.55.tgz",
"integrity": "sha512-jswRF6q3eq8NWpWiqct6q+6Fg/I7nUhrxYJfiEM8JJpap0wVJLQdbKtyS65GdlK7S7Ytnx3TTi/bmw+tBhkGmg=="
"version": "7.2.96",
"resolved": "https://registry.npmjs.org/@mdi/font/-/font-7.2.96.tgz",
"integrity": "sha512-e//lmkmpFUMZKhmCY9zdjRe4zNXfbOIJnn6xveHbaV2kSw5aJ5dLXUxcRt1Gxfi7ZYpFLUWlkG2MGSFAiqAu7w=="
},
"@mrmlnc/readdir-enhanced": {
"version": "2.2.1",


@@ -3,11 +3,11 @@
"version": "0.0.0",
"private": true,
"scripts": {
"serve": "NODE_OPTIONS=--openssl-legacy-provider vue-cli-service serve --host=0.0.0.0",
"build": "NODE_OPTIONS=--openssl-legacy-provider vue-cli-service build"
"serve": "vue-cli-service serve --host=0.0.0.0",
"build": "vue-cli-service build"
},
"dependencies": {
"@mdi/font": "^5.6.55",
"@mdi/font": "^7.2.96",
"core-js": "^3.6.5",
"d3": "^7.0.1",
"jwt-decode": "^3.0.0",


@@ -44,7 +44,7 @@
<template v-slot:activator="{ on, attrs }">
<v-text-field
v-model="tsDate"
:disabled="!!(sequence || point || entrySequence || entryPoint)"
:disabled="!!(entrySequence || entryPoint)"
label="Date"
suffix="UTC"
prepend-icon="mdi-calendar"
@@ -64,7 +64,7 @@
<v-col>
<v-text-field
v-model="tsTime"
:disabled="!!(sequence || point || entrySequence || entryPoint)"
:disabled="!!(entrySequence || entryPoint)"
label="Time"
suffix="UTC"
prepend-icon="mdi-clock-outline"
@@ -256,6 +256,15 @@
>
Cancel
</v-btn>
<v-btn v-if="!id && (entrySequence || entryPoint)"
color="info"
text
title="Enter an event by time"
@click="timed"
>
<v-icon left small>mdi-clock-outline</v-icon>
Timed
</v-btn>
<v-spacer></v-spacer>
<v-btn
:disabled="!canSave"
@@ -632,6 +641,14 @@ export default {
}
},
timed () {
const tstamp = (new Date()).toISOString();
this.entrySequence = null;
this.entryPoint = null;
this.tsDate = tstamp.substr(0, 10);
this.tsTime = tstamp.substr(11, 8);
},
close () {
this.entryLabels = this.selectedLabels.map(this.labelToItem)
this.$emit("input", false);


@@ -16,7 +16,7 @@ async function api ({state, commit, dispatch}, [resource, init = {}, cb]) {
const url = /^https?:\/\//i.test(resource) ? resource : (state.apiUrl + resource);
const res = await fetch(url, init);
if (typeof cb === 'function') {
cb(null, res);
await cb(null, res);
}
if (res.ok) {
@@ -35,7 +35,14 @@ async function api ({state, commit, dispatch}, [resource, init = {}, cb]) {
throw err;
}
} else {
await dispatch('showSnack', [res.statusText, "warning"]);
let message = res.statusText;
if (res.headers.get("Content-Type")?.match(/^application\/json/i)) { // header may be absent
const body = await res.json();
if (body.message) {
message = body.message;
}
}
await dispatch('showSnack', [message, "warning"]);
}
} catch (err) {
if (err && err.name == "AbortError") return;

View File

@@ -72,6 +72,10 @@
:href="`/api/project/${$route.params.project}/event/-/${$route.params.sequence}?mime=application%2Fjson`"
title="Download as a generic JSON file"
>JSON</v-list-item>
<v-list-item
:href="`/api/project/${$route.params.project}/event/-/${$route.params.sequence}?mime=text%2Fcsv`"
title="Download as Comma Separated Values file"
>CSV</v-list-item>
<v-list-item
:href="`/api/project/${$route.params.project}/event/-/${$route.params.sequence}?mime=text%2Fhtml`"
title="Download as an HTML formatted file"
@@ -604,16 +608,16 @@ export default {
async getLabelDefinitions () {
const url = `/project/${this.$route.params.project}/label`;
const labelSet = {};
const labels = await this.api([url]) || [];
labels.forEach( l => labelSet[l.name] = l.data );
this.labels = labelSet;
//const labelSet = {};
this.labels = await this.api([url]) ?? {};
//labels.forEach( l => labelSet[l.name] = l.data );
//this.labels = labelSet;
},
async getPresetRemarks () {
const url = `/project/${this.$route.params.project}/configuration/events/presetRemarks`;
const url = `/project/${this.$route.params.project}/configuration`;
this.presetRemarks = await this.api([url]);
this.presetRemarks = (await this.api([url]))?.events?.presetRemarks ?? {};
},
newItem (from = {}) {

View File

@@ -375,7 +375,8 @@ export default {
}
],
labels: {},
hashMarker: null
hashMarker: null,
references: {}
};
},
@@ -474,7 +475,7 @@ export default {
bounds._northEast.lng,
bounds._northEast.lat
].map(i => i.toFixed(bboxScale)).join(",");
const limit = 10000;
const limit = 10000; // Empirical value
const query = new URLSearchParams({bbox, limit});
@@ -511,7 +512,9 @@ export default {
}
l.layer.clearLayers();
if (layer instanceof L.Layer || (layer.features && layer.features.length < limit) || ("length" in layer && layer.length < limit)) {
//if (layer instanceof L.Layer || (layer.features && layer.features.length < limit) || ("length" in layer && layer.length < limit)) {
if (layer instanceof L.Layer || ((layer.features?.length ?? layer?.length) < limit)) {
if (l.layer.addData) {
l.layer.addData(layer);
} else if (l.layer.addLayer) {
@@ -519,8 +522,12 @@ export default {
}
l.layer.lastRequestURL = url;
} else if (!layer.features) {
console.log(`Layer ${url} is empty`);
} else {
console.warn("Too much data from", url);
console.warn(`Too much data from ${url} (${layer.features?.length ?? layer.length} > ${limit} features)`);
this.showSnack([`Layer ${l.layer.options.userLayerName ? ""+l.layer.options.userLayerName+" " : ""}is too large: ${layer.features?.length ?? layer.length} features; maximum is ${limit}`, "error"]);
}
})
.finally( () => {
@@ -674,13 +681,140 @@ export default {
async getLabelDefinitions () {
const url = `/project/${this.$route.params.project}/label`;
const labelSet = {};
const labels = await this.api([url]) || [];
labels.forEach( l => labelSet[l.name] = l.data );
this.labels = labelSet;
this.labels = await this.api([url]) || {}; // labels is an object keyed by name
},
...mapActions(["api"])
removeUserLayers () {
map.eachLayer( layer => {
if (layer.options.userLayer === true) {
console.log("Removing", layer);
layer.eachLayer( sublayer => {
const idx = this.layerRefreshConfig.findIndex(i => i.layer == sublayer);
if (idx != -1) {
this.layerRefreshConfig.splice(idx, 1);
}
});
map.removeLayer(layer);
this.references.layerControl.removeLayer(layer);
}
});
},
async addUserLayers (userLayers) {
const options = {
userLayer: true,
style (feature) {
const style = {
stroke: undefined,
color: "grey",
weight: 2,
opacity: 0.5,
lineCap: undefined,
lineJoin: undefined,
dashArray: undefined,
dashOffset: undefined,
fill: undefined,
fillColor: "lightgrey",
fillOpacity: 0.5,
fillRule: undefined
};
for (let key in style) {
switch (key) {
case "color":
style[key] = feature.properties?.colour ?? feature.properties?.color ?? style[key];
break;
case "fillColor":
style[key] = feature.properties?.fillColour ?? feature.properties?.fillColor ?? style[key];
break;
default:
style[key] = feature.properties?.[key] ?? style[key];
}
if (typeof style[key] === "undefined") {
delete style[key];
}
}
return style;
}
};
const userLayerGroups = {};
userLayers.forEach(layer => {
if (!(layer.name in userLayerGroups)) {
userLayerGroups[layer.name] = [];
}
userLayerGroups[layer.name].push(layer);
});
for (let userLayerName in userLayerGroups) {
const userLayerGroup = userLayerGroups[userLayerName];
const layer = L.featureGroup(null, {userLayer: true, userLayerGroup: true, userLayerName});
userLayerGroup.forEach(l => {
const sublayer = L.geoJSON(null, {...options, userLayerName});
layer.addLayer(sublayer);
sublayer.on('add', ({target}) => {
this.refreshLayers([target])
});
if (l.tooltip) {
sublayer.bindTooltip((layer) => {
return layer?.feature?.properties?.[l.tooltip] ?? userLayerName;
});
}
if (l.popup) {
if (l.popup === true) {
sublayer.bindPopup((layer) => {
const p = layer?.feature?.properties;
let t = "";
if (p) {
t += "<table>";
for (let [k, v] of Object.entries(p)) {
t += `<tr><td><b>${k}: </b></td><td>${v}</td></tr>`;
}
t += "</table>";
return t;
}
return userLayerName;
});
} else {
sublayer.bindPopup((layer) => {
return layer?.feature?.properties?.[l.popup] ?? userLayerName;
});
}
}
const refreshConfig = {
layer: sublayer,
url: (query = "") => {
return `/files/${l.path}`;
}
};
this.layerRefreshConfig.push(refreshConfig);
});
layer.on('add', ({target}) => {
this.refreshLayers(target.getLayers())
});
this.references.layerControl.addOverlay(layer, `<span title="User layer" style="text-decoration: dotted underline;">${userLayerName}</span>`);
}
},
async fetchUserLayers () {
const url = `/project/${this.$route.params.project}/gis/layer`;
const userLayers = await this.api([url]) || [];
this.removeUserLayers();
this.addUserLayers(userLayers);
},
...mapActions(["api", "showSnack"])
},
@@ -750,7 +884,7 @@ export default {
}
if (init.activeLayers) {
init.activeLayers.forEach(l => layers[l].addTo(map));
init.activeLayers.forEach(l => layers[l]?.addTo(map));
} else {
layers.OpenSeaMap.addTo(map);
layers.Preplots.addTo(map);
@@ -759,6 +893,9 @@ export default {
const layerControl = L.control.layers(tileMaps, layers).addTo(map);
const scaleControl = L.control.scale().addTo(map);
this.references.layerControl = layerControl;
this.references.scaleControl = scaleControl;
if (init.position) {
map.setView(init.position.slice(1), init.position[0]);
} else {
@@ -786,10 +923,13 @@ export default {
map.on('layeradd', this.updateURL);
map.on('layerremove', this.updateURL);
this.layerRefreshConfig.forEach( l => {
l.layer.on('add', ({target}) => this.refreshLayers([target]));
});
this.fetchUserLayers();
if (init.position) {
this.refreshLayers();
} else {

View File

@@ -119,7 +119,11 @@
>
<template v-slot:item.srss="{item}">
<v-icon small :title="srssInfo(item)">{{srssIcon(item)}}</v-icon>
<span style="white-space: nowrap;">
<v-icon small :title="srssInfo(item)">{{srssIcon(item)}}</v-icon>
/
<v-icon small :title="wxInfo(item)" v-if="item.meta.wx">{{wxIcon(item)}}</v-icon>
</span>
</template>
<template v-slot:item.sequence="{item, value}">
@@ -422,6 +426,123 @@ export default {
plannerConfig: null,
shiftAll: false, // Shift all sequences checkbox
// Weather API
wxData: null,
weathercode: {
0: {
description: "Clear sky",
icon: "mdi-weather-sunny"
},
1: {
description: "Mainly clear",
icon: "mdi-weather-sunny"
},
2: {
description: "Partly cloudy",
icon: "mdi-weather-partly-cloudy"
},
3: {
description: "Overcast",
icon: "mdi-weather-cloudy"
},
45: {
description: "Fog",
icon: "mde-weather-fog"
},
48: {
description: "Depositing rime fog",
icon: "mdi-weather-fog"
},
51: {
description: "Light drizzle",
icon: "mdi-weather-partly-rainy"
},
53: {
description: "Moderate drizzle",
icon: "mdi-weather-partly-rainy"
},
55: {
description: "Dense drizzle",
icon: "mdi-weather-rainy"
},
56: {
description: "Light freezing drizzle",
icon: "mdi-weather-partly-snowy-rainy"
},
57: {
description: "Freezing drizzle",
icon: "mdi-weather-partly-snowy-rainy"
},
61: {
description: "Light rain",
icon: "mdi-weather-rainy"
},
63: {
description: "Moderate rain",
icon: "mdi-weather-rainy"
},
65: {
description: "Heavy rain",
icon: "mdi-weather-pouring"
},
66: {
description: "Light freezing rain",
icon: "mdi-loading"
},
67: {
description: "Freezing rain",
icon: "mdi-loading"
},
71: {
description: "Light snow",
icon: "mdi-loading"
},
73: {
description: "Moderate snow",
icon: "mdi-loading"
},
75: {
description: "Heavy snow",
icon: "mdi-loading"
},
77: {
description: "Snow grains",
icon: "mdi-loading"
},
80: {
description: "Light rain showers",
icon: "mdi-loading"
},
81: {
description: "Moderate rain showers",
icon: "mdi-loading"
},
82: {
description: "Violent rain showers",
icon: "mdi-loading"
},
85: {
description: "Light snow showers",
icon: "mdi-loading"
},
86: {
description: "Snow showers",
icon: "mdi-loading"
},
95: {
description: "Thunderstorm",
icon: "mdi-loading"
},
96: {
description: "Hailstorm",
icon: "mdi-loading"
},
99: {
description: "Heavy hailstorm",
icon: "mdi-loading"
},
},
// Context menu stuff
contextMenuShow: false,
contextMenuX: 0,
@@ -630,6 +751,113 @@ export default {
return text.join("\n");
},
wxInfo (line) {
function atm(key) {
return line.meta?.wx?.atmospheric?.hourly?.[key];
}
function mar(key) {
return line.meta?.wx?.marine?.hourly?.[key];
}
const code = atm("weathercode");
const description = this.weathercode[code]?.description ?? `WMO code ${code}`;
const wind_speed = Math.round(atm("windspeed_10m"));
const wind_direction = String(Math.round(atm("winddirection_10m"))).padStart(3, "0");
const pressure = Math.round(atm("surface_pressure"));
const temperature = Math.round(atm("temperature_2m"));
const humidity = atm("relativehumidity_2m");
const precipitation = atm("precipitation");
const precipitation_probability = atm("precipitation_probability");
const precipitation_str = precipitation_probability
? `\nPrecipitation ${precipitation} mm (prob. ${precipitation_probability}%)`
: ""
const wave_height = mar("wave_height").toFixed(1);
const wave_direction = mar("wave_direction");
const wave_period = mar("wave_period");
return `${description}\n${temperature}° C\n${pressure} hPa\nWind ${wind_speed} kt ${wind_direction}°\nRelative humidity ${humidity}%${precipitation_str}\nWaves ${wave_height} m ${wave_direction}° @ ${wave_period} s`;
},
wxIcon (line) {
const code = line.meta?.wx?.atmospheric?.hourly?.weathercode;
return this.weathercode[code]?.icon ?? "mdi-help";
},
async wxQuery (line) {
function midpoint(line) {
// WARNING Fails if across the antimeridian
const longitude = (line.geometry.coordinates[0][0] + line.geometry.coordinates[1][0])/2;
const latitude = (line.geometry.coordinates[0][1] + line.geometry.coordinates[1][1])/2;
return [ longitude, latitude ];
}
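// e.g. longitudes -179.9 and +179.8 average to -0.05 here, not the correct ≈179.95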
function extract (fcst) {
const τ = (line.ts0.valueOf() + line.ts1.valueOf()) / 2000;
// Find the index of the forecast hour nearest to the line midpoint time τ
const [idx, ε] = fcst?.hourly?.time?.reduce( (acc, cur, idx) => {
const δ = Math.abs(cur - τ);
const retval = acc
? acc[1] < δ
? acc
: [ idx, δ ]
: [ idx, δ ];
return retval;
}, null) ?? [];
if (idx != null) {
for (let key in fcst?.hourly) {
fcst.hourly[key] = fcst.hourly[key][idx];
}
}
return fcst;
}
async function fetch_atmospheric (opts) {
const { longitude, latitude, dt0, dt1 } = opts;
const url = `https://api.open-meteo.com/v1/forecast?latitude=${latitude}&longitude=${longitude}&hourly=temperature_2m,relativehumidity_2m,precipitation_probability,precipitation,weathercode,pressure_msl,surface_pressure,windspeed_10m,winddirection_10m&daily=uv_index_max&windspeed_unit=kn&timeformat=unixtime&timezone=GMT&start_date=${dt0}&end_date=${dt1}&format=json`;
const init = {};
const res = await fetch (url, init);
if (res?.ok) {
const data = await res.json();
return extract(data);
}
}
async function fetch_marine (opts) {
const { longitude, latitude, dt0, dt1 } = opts;
const url = `https://marine-api.open-meteo.com/v1/marine?latitude=${latitude}&longitude=${longitude}&hourly=wave_height,wave_direction,wave_period&timeformat=unixtime&timezone=GMT&start_date=${dt0}&end_date=${dt1}&format=json`;
const init = {};
const res = await fetch (url, init);
if (res?.ok) {
const data = await res.json();
return extract(data);
}
}
if (line) {
const [ longitude, latitude ] = midpoint(line);
const dt0 = line.ts0.toISOString().substr(0, 10);
const dt1 = line.ts1.toISOString().substr(0, 10);
return {
atmospheric: await fetch_atmospheric({longitude, latitude, dt0, dt1}),
marine: await fetch_marine({longitude, latitude, dt0, dt1})
};
}
},
lagAfter (item) {
const pos = this.items.indexOf(item)+1;
if (pos != 0) {
@@ -723,6 +951,9 @@ export default {
for (const item of this.items) {
item.ts0 = new Date(item.ts0);
item.ts1 = new Date(item.ts1);
this.wxQuery(item).then( (wx) => {
item.meta = {...item.meta, wx};
});
}
},

View File

@@ -98,7 +98,7 @@ export default {
methods: {
async list () {
this.items = await this.api(["/project/"]) || [];
this.items = await this.api(["/project"]) || [];
},
async summary (item) {

View File

@@ -433,10 +433,7 @@ export default {
async getLabelDefinitions () {
const url = `/project/${this.$route.params.project}/label`;
const labelSet = {};
const labels = await this.api([url]) || [];
labels.forEach( l => labelSet[l.name] = l.data );
this.labels = labelSet;
this.labels = await this.api([url]) || {};
},
async getQCData () {

View File

@@ -10,7 +10,7 @@ const mw = require('./middleware');
const { ERROR, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
const verbose = process.env.NODE_ENV != 'test';
const app = express();
app.locals.version = "0.3.1"; // API version
app.locals.version = "0.4.0"; // API version
app.map = function(a, route){
route = route || '';
@@ -100,8 +100,8 @@ app.map({
get: [ mw.project.summary.get ],
},
'/project/:project/configuration': {
get: [ mw.auth.access.write, mw.project.configuration.get ], // Get project configuration
patch: [ mw.auth.access.write, mw.project.configuration.patch ], // Modify project configuration
get: [ mw.project.configuration.get ], // Get project configuration
patch: [ mw.auth.access.admin, mw.project.configuration.patch ], // Modify project configuration
},
/*
@@ -109,19 +109,25 @@ app.map({
*/
'/project/:project/gis': {
get: [ mw.gis.project.bbox ]
get: [ mw.etag.noSave, mw.gis.project.bbox ]
},
'/project/:project/gis/preplot': {
get: [ mw.gis.project.preplot ]
get: [ mw.etag.noSave, mw.gis.project.preplot ]
},
'/project/:project/gis/preplot/:featuretype(line|point)': {
get: [ mw.gis.project.preplot ]
get: [ mw.etag.noSave, mw.gis.project.preplot ]
},
'/project/:project/gis/raw/:featuretype(line|point)': {
get: [ mw.gis.project.raw ]
get: [ mw.etag.noSave, mw.gis.project.raw ]
},
'/project/:project/gis/final/:featuretype(line|point)': {
get: [ mw.gis.project.final ]
get: [ mw.etag.noSave, mw.gis.project.final ]
},
'/project/:project/gis/layer': {
get: [ mw.etag.noSave, mw.gis.project.layer.get ]
},
'/project/:project/gis/layer/:name': {
get: [ mw.etag.noSave, mw.gis.project.layer.get ]
},
/*
@@ -194,25 +200,25 @@ app.map({
'/project/:project/qc': {
'/results': {
// Get all QC results for :project
get: [ mw.qc.results.get ],
get: [ mw.etag.noSave, mw.qc.results.get ],
// Delete all QC results for :project
delete: [ mw.auth.access.write, mw.qc.results.delete ],
delete: [ mw.etag.noSave, mw.auth.access.write, mw.qc.results.delete ],
'/accept': {
post: [ mw.auth.access.write, mw.qc.results.accept ]
post: [ mw.etag.noSave, mw.auth.access.write, mw.qc.results.accept ]
},
'/unaccept': {
post: [ mw.auth.access.write, mw.qc.results.unaccept ]
post: [ mw.etag.noSave, mw.auth.access.write, mw.qc.results.unaccept ]
},
'/sequence/:sequence': {
// Get QC results for :project, :sequence
get: [ mw.qc.results.get ],
get: [ mw.etag.noSave, mw.qc.results.get ],
// Delete QC results for :project, :sequence
delete: [ mw.auth.access.write, mw.qc.results.delete ]
delete: [ mw.etag.noSave, mw.auth.access.write, mw.qc.results.delete ]
}
}
},
@@ -262,9 +268,9 @@ app.map({
get: [ mw.auth.access.write, mw.etag.noSave, mw.files.get ]
},
'/navdata/': {
get: [ mw.navdata.get ],
get: [ mw.etag.noSave, mw.navdata.get ],
'gis/:featuretype(line|point)': {
get: [ mw.gis.navdata.get ]
get: [ mw.etag.noSave, mw.gis.navdata.get ]
}
},
'/info/': {

View File

@@ -1,4 +1,4 @@
const expressJWT = require('express-jwt');
const {expressjwt: expressJWT} = require('express-jwt');
const cfg = require("../../../lib/config").jwt;
@@ -15,6 +15,7 @@ const options = {
secret: cfg.secret,
credentialsRequired: false,
algorithms: ['HS256'],
requestProperty: "user",
getToken
};

View File

@@ -33,7 +33,9 @@ function saveResponse (res) {
const cache = getCache(res);
const req = res.req;
console.log(`Saving ETag: ${req.method} ${req.url} ${etag}`);
cache[req.url] = {etag, headers: res.getHeaders()};
const headers = structuredClone(res.getHeaders());
delete headers["set-cookie"];
cache[req.url] = {etag, headers};
}
}
};

View File

@@ -0,0 +1,85 @@
const { stringify } = require('csv');
const { transform, prepare } = require('../../../../../lib/sse');
const csv = async function (req, res, next) {
try {
const query = req.query;
query.sequence = req.params.sequence;
const {events, sequences} = await prepare(req.params.project, query);
if ("download" in query || "d" in query) {
const extension = "csv";
// Get the sequence number(s) (more than one sequence can be selected)
const seqNums = query.sequence.split(";");
// If we've only been asked for a single sequence, get its line name
const lineName = (sequences.find(i => i.sequence == seqNums[0]) || {})?.meta?.lineName;
const filename = (seqNums.length == 1 && lineName)
? `${lineName}-NavLog.${extension}`
: `${req.params.project}-${query.sequence}.${extension}`;
res.set("Content-Disposition", `attachment; filename="${filename}"`);
}
const columns = {
id: "id",
unix_epoch: (row) => Math.floor(row.tstamp/1000),
timestamp: (row) => (new Date(row.tstamp)).toISOString(),
sequence: "sequence",
point: "point",
text: "remarks",
labels: (row) => row.labels.join(";"),
latitude: (row) => {
if (row.meta.geometry?.type == "Point" && row.meta.geometry?.coordinates) {
return row.meta.geometry.coordinates[1];
}
},
longitude: (row) => {
if (row.meta.geometry?.type == "Point" && row.meta.geometry?.coordinates) {
return row.meta.geometry.coordinates[0];
}
}
};
let fields = [ "timestamp", "sequence", "point", "text", "labels", "latitude", "longitude", "id" ];
if (req.query.fields) {
fields = req.query.fields.split(/[,;:.\s+*|]+/);
}
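// e.g. ?fields=timestamp;sequence,point → ["timestamp", "sequence", "point"]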
let delimiter = req.query.delimiter || ",";
const stringifier = stringify({delimiter});
stringifier.on('error', (err) => {
console.error(err.message);
});
stringifier.on('readable', () => {
let row;
while ((row = stringifier.read()) !== null) {
res.write(row);
}
});
res.status(200);
if (!req.query.header || req.query.header.toLowerCase() == "true" || req.query.header == "1") {
// Send header
stringifier.write(fields);
}
events.forEach( event => {
stringifier.write(fields.map( field => {
if (typeof columns[field] === "function") {
return columns[field](event);
} else {
return event[columns[field]];
}
}));
});
stringifier.end();
res.end();
next();
} catch (err) {
next(err);
}
};
module.exports = csv;

View File

@@ -2,6 +2,7 @@ const json = require('./json');
const geojson = require('./geojson');
const seis = require('./seis');
const html = require('./html');
const csv = require('./csv');
const pdf = require('./pdf');
module.exports = async function (req, res, next) {
@@ -11,6 +12,7 @@ module.exports = async function (req, res, next) {
"application/geo+json": geojson,
"application/vnd.seis+json": seis,
"text/html": html,
"text/csv": csv,
"application/pdf": pdf
};

View File

@@ -2,5 +2,6 @@ module.exports = {
bbox: require('./bbox'),
preplot: require('./preplot'),
raw: require('./raw'),
final: require('./final')
final: require('./final'),
layer: require('./layer')
};

View File

@@ -0,0 +1,18 @@
const { gis } = require('../../../../../lib/db');
module.exports = async function (req, res, next) {
try {
const layers = await gis.project.layer.get(req.params.project, req.params.name);
if (req.params.name && (!layers || !layers.length)) {
res.status(404).json({message: "Not found"});
} else {
res.status(200).send(layers ?? []);
}
next();
} catch (err) {
next(err);
}
};

View File

@@ -0,0 +1,3 @@
module.exports = {
get: require('./get')
};

View File

@@ -1,10 +1,11 @@
const { label } = require('../../../lib/db');
const { project } = require('../../../lib/db');
module.exports = async function (req, res, next) {
try {
res.status(200).send(await label.list(req.params.project, req.query));
const labels = (await project.configuration.get(req.params.project))?.labels ?? {};
res.status(200).send(labels);
next();
} catch (err) {
next(err);

View File

@@ -69,7 +69,7 @@ class DetectFDSP {
point: prev._point,
remarks: "Last shotpoint of the day",
labels: ["LDSP", "Prod"],
meta: {auto: true, insertedBy: this.constructor.name}
meta: {auto: true, author: `*${this.constructor.name}*`}
};
const fdsp = {
@@ -77,7 +77,7 @@ class DetectFDSP {
point: cur._point,
remarks: "First shotpoint of the day",
labels: ["FDSP", "Prod"],
meta: {auto: true, insertedBy: this.constructor.name}
meta: {auto: true, author: `*${this.constructor.name}*`}
};
INFO("LDSP", ldsp);

View File

@@ -0,0 +1,128 @@
const { schema2pid } = require('../../lib/db/connection');
const { event } = require('../../lib/db');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
class DetectSoftStart {
/* Data may come much faster than we can process it, so we put it
* in a queue and process it at our own pace.
*
* The run() method fills the queue with the necessary data and then
* calls processQueue().
*
* The processQueue() method takes the first two elements in
* the queue and processes them if they are not already being taken
* care of by a previous processQueue() call; this will happen when
* data is coming in faster than it can be processed.
*
* If the processQueue() call is the first to see the two bottommost
* elements, it will process them and, when finished, it will set
* the `isPending` flag of the bottommost element to `false`, thus
* letting the next call know that it has work to do.
*
* If the queue was empty, run() will set the `isPending` flag of its
* first element to a falsy value, thus bootstrapping the process.
*/
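// Illustration: run() pushes {isPending: this.queue.length, ...}, so the first
// element pushed onto an empty queue gets isPending = 0 (falsy) and is picked
// up immediately; later pushes get truthy values until processQueue() clears
// the flag of the element ahead of them.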
static MAX_QUEUE_SIZE = 125000;
queue = [];
async processQueue () {
DEBUG("Queue length", this.queue.length)
while (this.queue.length > 1) {
if (this.queue[0].isPending) {
DEBUG("Queue busy");
setImmediate(() => this.processQueue());
return;
}
const prev = this.queue.shift();
const cur = this.queue[0];
try {
// DEBUG("Previous", prev);
// DEBUG("Current", cur);
// TODO:
// Consider whether to remember if soft start / full volume events
// have already been emitted and wait until there is an online/offline
// transition before re-emitting.
// This may or may not be a good idea.
// Look for a soft start or full volume event
if (cur.num_active >= 1 && !prev.num_active && cur.num_active < cur.num_guns) {
INFO("Soft start detected @", cur.tstamp);
const projectId = await schema2pid(cur._schema ?? prev._schema);
// TODO: Try and grab the corresponding comment from the configuration?
const payload = {
tstamp: cur.tstamp,
remarks: "Soft start",
labels: [ "Daily", "Guns", "Prod" ],
meta: {auto: true, author: `*${this.constructor.name}*`}
};
DEBUG("Posting event", projectId, payload);
await event.post(projectId, payload);
} else if (cur.num_active == cur.num_guns && prev.num_active < cur.num_active) {
INFO("Full volume detected @", cur.tstamp);
const projectId = await schema2pid(cur._schema ?? prev._schema);
// TODO: Try and grab the corresponding comment from the configuration?
const payload = {
tstamp: cur.tstamp,
remarks: "Full volume",
labels: [ "Daily", "Guns", "Prod" ],
meta: {auto: true, author: `*${this.constructor.name}*`}
};
DEBUG("Posting event", projectId, payload);
await event.post(projectId, payload);
}
// Processing of this shot has already been completed.
// The queue can now move forward.
} catch (err) {
ERROR("DetectSoftStart Error")
ERROR(err);
} finally {
cur.isPending = false;
}
}
}
async run (data) {
if (!data || data.channel !== "realtime") {
return;
}
if (!(data.payload && data.payload.new && data.payload.new.meta)) {
return;
}
const meta = data.payload.new.meta;
if (this.queue.length < DetectSoftStart.MAX_QUEUE_SIZE) {
this.queue.push({
isPending: this.queue.length,
_schema: meta._schema,
tstamp: meta.tstamp ?? meta.time,
shot: meta.shot,
lineStatus: meta.lineStatus,
_sequence: meta._sequence,
_point: meta._point,
lineName: meta.lineName,
num_guns: meta.num_guns,
num_active: meta.num_active
});
} else {
ALERT("DetectSoftStart queue full at", this.queue.length);
}
this.processQueue();
}
}
module.exports = DetectSoftStart;

View File

@@ -1,23 +1,24 @@
const { schema2pid } = require('../../lib/db/connection');
const { event } = require('../../lib/db');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
class DetectSOLEOL {
/* Data may come much faster than we can process it, so we put it
* in a queue and process it at our own pace.
*
*
* The run() method fills the queue with the necessary data and then
* calls processQueue().
*
*
* The processQueue() method takes the first two elements in
* the queue and processes them if they are not already being taken
* care of by a previous processQueue() call; this will happen when
* data is coming in faster than it can be processed.
*
*
* If the processQueue() call is the first to see the two bottommost
* elements, it will process them and, when finished, it will set
* the `isPending` flag of the bottommost element to `false`, thus
* letting the next call know that it has work to do.
*
*
* If the queue was empty, run() will set the `isPending` flag of its
* first element to a falsy value, thus bootstrapping the process.
*/
@@ -26,8 +27,10 @@ class DetectSOLEOL {
queue = [];
async processQueue () {
DEBUG("Queue length", this.queue.length)
while (this.queue.length > 1) {
if (this.queue[0].isPending) {
DEBUG("Queue busy");
setImmediate(() => this.processQueue());
return;
}
@@ -38,9 +41,15 @@ class DetectSOLEOL {
const sequence = Number(cur._sequence);
try {
DEBUG("Sequence", sequence);
// DEBUG("Previous", prev);
// DEBUG("Current", cur);
if (prev.lineName == cur.lineName && prev._sequence == cur._sequence &&
prev.lineStatus != "online" && cur.lineStatus == "online" && sequence) {
INFO("Transition to ONLINE detected");
// DEBUG(cur);
// DEBUG(prev);
// console.log("TRANSITION TO ONLINE", prev, cur);
// Check if there are already FSP, FGSP events for this sequence
@@ -59,16 +68,22 @@ class DetectSOLEOL {
sequence,
point: cur._point,
remarks,
labels
labels,
meta: {auto: true, author: `*${this.constructor.name}*`}
}
// console.log(projectId, payload);
INFO("Posting event", projectId, payload);
await event.post(projectId, payload);
} else {
// A first shot point has been already entered in the log,
// so we have nothing to do here.
INFO("FSP already in the log. Doing nothing");
}
} else if (prev.lineStatus == "online" && cur.lineStatus != "online") {
INFO("Transition to OFFLINE detected");
// DEBUG(cur);
// DEBUG(prev);
// console.log("TRANSITION TO OFFLINE", prev, cur);
// Check if there are already LSP, LGSP events for this sequence
@@ -87,14 +102,17 @@ class DetectSOLEOL {
sequence,
point: prev._point,
remarks,
labels
labels,
meta: {auto: true, author: `*${this.constructor.name}*`}
}
// console.log(projectId, payload);
INFO("Posting event", projectId, payload);
await event.post(projectId, payload);
} else {
// A first shot point has been already entered in the log,
// so we have nothing to do here.
INFO("LSP already in the log. Doing nothing");
}
}
// Processing of this shot has already been completed.

View File

@@ -1,5 +1,7 @@
const Handlers = [
require('./detect-soleol'),
require('./detect-soft-start'),
require('./report-line-change-time'),
require('./detect-fdsp')
];

View File

@@ -0,0 +1,301 @@
const { schema2pid } = require('../../lib/db/connection');
const { event, project } = require('../../lib/db');
const { withinValidity } = require('../../lib/utils/ranges');
const unique = require('../../lib/utils/unique');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
class ReportLineChangeTime {
/* Data may come much faster than we can process it, so we put it
* in a queue and process it at our own pace.
*
* The run() method fills the queue with the necessary data and then
* calls processQueue().
*
* The processQueue() method takes the first two elements in
* the queue and processes them if they are not already being taken
* care of by a previous processQueue() call; this will happen when
* data is coming in faster than it can be processed.
*
* If the processQueue() call is the first to see the two bottommost
* elements, it will process them and, when finished, it will set
* the `isPending` flag of the bottommost element to `false`, thus
* letting the next call know that it has work to do.
*
* If the queue was empty, run() will set the `isPending` flag of its
* first element to a falsy value, thus bootstrapping the process.
*/
static MAX_QUEUE_SIZE = 125000;
queue = [];
author = `*${this.constructor.name}*`;
async processQueue () {
DEBUG("Queue length", this.queue.length)
while (this.queue.length > 0) {
if (this.queue[0].isPending) {
DEBUG("Queue busy");
setImmediate(() => this.processQueue());
return;
}
const cur = this.queue.shift();
const next = this.queue[0];
const projectId = cur.pid;
// Are we being called because of a LGSP or because of a FGSP?
const forward = (cur.old?.labels?.includes("LGSP") || cur.new?.labels?.includes("LGSP"));
if (!projectId) {
WARNING("No projectID found in event", cur);
return;
}
async function getConfiguration (projectId) {
return await project.configuration.get(projectId);
}
async function getLineChangeTime (data, forward = false) {
if (forward) {
const ospEvents = await event.list(projectId, {label: "FGSP"});
// DEBUG("ospEvents", ospEvents);
const osp = ospEvents.filter(i => i.tstamp > data.tstamp).pop();
DEBUG("fsp", osp);
// DEBUG("data", data);
if (osp) {
DEBUG("lineChangeTime", osp.tstamp - data.tstamp);
return { lineChangeTime: osp.tstamp - data.tstamp, osp };
}
} else {
const ospEvents = await event.list(projectId, {label: "LGSP"});
// DEBUG("ospEvents", ospEvents);
const osp = ospEvents.filter(i => i.tstamp < data.tstamp).shift();
DEBUG("lsp", osp);
// DEBUG("data", data);
if (osp) {
DEBUG("lineChangeTime", data.tstamp - osp.tstamp);
return { lineChangeTime: data.tstamp - osp.tstamp, osp };
}
}
}
function parseInterval (dt) {
const daySeconds = (dt/1000) % 86400;
const d = Math.floor((dt/1000) / 86400);
const dateObject = new Date(0); // Unix epoch
dateObject.setSeconds(daySeconds);
const [ h, m, s ] = dateObject.toISOString().slice(11, 19).split(":").map(Number);
return {d, h, m, s};
}
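// e.g. parseInterval(93784000) → { d: 1, h: 2, m: 3, s: 4 } (93 784 s = 1 d 2 h 3 min 4 s)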
function formatInterval (i) {
let str = "";
for (let [k, v] of Object.entries(i)) {
if (v) {
str += " " + v + " " + k;
}
}
return str.trim();
}
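// e.g. formatInterval({d: 1, h: 2, m: 3, s: 4}) → "1 d 2 h 3 m 4 s"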
const deleteStaleEvents = async (seq) => {
if (seq) {
DEBUG("Will delete lct events related to sequence(s)", seq);
const jpq = `$."${this.author}"`;
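// e.g. $."*ReportLineChangeTime*" — matches events whose meta carries this handler's key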
const opts = {jpq};
if (Array.isArray(seq)) {
opts.sequences = unique(seq).filter(i => !!i);
} else {
opts.sequence = seq;
}
const staleEvents = await event.list(projectId, opts);
DEBUG(staleEvents?.length ?? 0, "events to delete");
for (let staleEvent of staleEvents) {
DEBUG(`Deleting event id ${staleEvent.id} (seq = ${staleEvent.sequence}, point = ${staleEvent.point})`);
await event.del(projectId, staleEvent.id);
}
}
}
const createLineChangeTimeEvents = async (lineChangeTime, data, osp) => {
const events = [];
const cfg = (await project.configuration.get(projectId));
const nlcd = cfg?.production?.nominalLineChangeDuration * 60*1000; // m → ms
DEBUG("nlcd", nlcd);
if (nlcd && lineChangeTime > nlcd) {
const excess = lineChangeTime-nlcd;
const excessString = formatInterval(parseInterval(excess));
DEBUG("excess", excess, excessString);
// ref: The later of the two events
const ref = forward ? osp : data;
const payload = {
// tstamp: new Date(ref.tstamp-1),
sequence: ref.sequence,
point: ref.point,
remarks: `_Nominal line change duration exceeded by ${excessString}_`,
labels: [ "Nav", "Prod" ],
meta: {
auto: true,
author: this.author,
[this.author]: {
parents: [
data.id,
osp.id
],
type: "excess",
value: excess
}
}
}
events.push(payload);
DEBUG("Created line change duration exceeded event", projectId, payload);
}
const lctString = formatInterval(parseInterval(lineChangeTime));
// ref: The later of the two events
const ref = forward ? osp : data;
const payload = {
// tstamp: new Date(ref.tstamp-1),
sequence: ref.sequence,
point: ref.point,
remarks: `Line change time: ${lctString}`,
labels: [ "Nav", "Prod" ],
meta: {
auto: true,
author: this.author,
[this.author]: {
parents: [
data.id,
osp.id
],
type: "lineChangeTime",
value: lineChangeTime
}
}
};
events.push(payload);
DEBUG("Created line change duration event", projectId, payload);
return events;
}
const maybePostEvent = async (projectId, payload) => {
DEBUG("Posting event", projectId, payload);
await event.post(projectId, payload);
}
try {
// DEBUG("Previous", prev);
DEBUG("Current", cur);
DEBUG("Forward search", forward);
// We have these scenarios to consider:
// INSERT:
// `old` will be NULL
// Add event with line change time:
// - match validity with `new`
// - meta.ReportLineChangeTime.link refers to new.uid (or new.id?)
// UPDATE:
// `old` is not NULL
// `new` is not NULL
// Delete previous event from event_log (not event_log_full)
// Add event with line change time:
// - match validity with `new`
// - meta.ReportLineChangeTime.link refers to new.uid (or new.id?)
// DELETE:
// `old` is not NULL
// `new` will be NULL
// Delete previous event from event_log (not event_log_full)
await deleteStaleEvents([cur.old?.sequence, cur.new?.sequence]);
if (cur.operation == "INSERT") {
// NOTE: UPDATE on the event_log view translates to one UPDATE plus one INSERT
// on event_log_full, so we don't need to worry about UPDATE here.
const data = cur.new;
DEBUG("INSERT seen: will add lct events related to ", data.id);
if (withinValidity(data.validity)) {
DEBUG("Event within validity period", data.validity, new Date());
data.tstamp = new Date(data.tstamp);
const { lineChangeTime, osp } = await getLineChangeTime(data, forward);
if (lineChangeTime) {
const events = await createLineChangeTimeEvents(lineChangeTime, data, osp);
if (events?.length) {
DEBUG("Deleting other events for sequence", events[0].sequence);
await deleteStaleEvents(events[0].sequence);
}
for (let payload of events) {
await maybePostEvent(projectId, payload);
}
}
} else {
DEBUG("Event outside of validity range", data.validity, "lct events not inserted");
}
}
// Processing of this shot has already been completed.
// The queue can now move forward.
} catch (err) {
ERROR("ReportLineChangeTime Error")
ERROR(err);
} finally {
if (next) {
next.isPending = false;
}
}
}
}
async run (data) {
if (!data || data.channel !== "event") {
return;
}
if (!(data.payload?.new?.labels) && !(data.payload?.old?.labels)) {
return;
}
const n = data.payload.new;
const o = data.payload.old;
if (!n?.labels?.includes("FGSP") && !o?.labels?.includes("FGSP") &&
!n?.labels?.includes("LGSP") && !o?.labels?.includes("LGSP")) {
return;
}
if (this.queue.length < ReportLineChangeTime.MAX_QUEUE_SIZE) {
this.queue.push({
...data.payload,
isPending: this.queue.length,
});
} else {
ALERT("ReportLineChangeTime queue full at", this.queue.length);
}
this.processQueue();
}
}
module.exports = ReportLineChangeTime;

View File

@@ -1,4 +1,4 @@
const { listen } = require('../ws/db');
const { listen } = require('../lib/db/notify');
const channels = require('../lib/db/channels');
const handlers = require('./handlers').init();
const { ERROR, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);

View File

@@ -1,6 +1,6 @@
#!/usr/bin/node
const { INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
const { ERROR, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
async function main () {
// Check that we're running against the correct database version

View File

@@ -10,25 +10,34 @@ async function list (projectId, opts = {}) {
const offset = Math.abs((opts.page-1)*opts.itemsPerPage) || 0;
const limit = Math.abs(Number(opts.itemsPerPage)) || null;
const filter = opts.sequence
? String(opts.sequence).includes(";")
? [ "sequence = ANY ( $1 )", [ opts.sequence.split(";") ] ]
: [ "sequence = $1", [ opts.sequence ] ]
: opts.date0
? opts.date1
? [ "date(tstamp) BETWEEN SYMMETRIC $1 AND $2", [ opts.date0, opts.date1 ] ]
: [ "date(tstamp) = $1", [ opts.date0 ] ]
: [ "true = true", [] ];
const sequence = opts.sequence && Number(opts.sequence) || null;
const sequences = opts.sequences && (Array.isArray(opts.sequences)
? opts.sequences.map(Number)
: opts.sequences.split(/[^0-9]+/).map(Number)) || null;
const date0 = opts.date0 ?? null;
const date1 = opts.date1 ?? null;
const jpq = opts.jpq || null;
const label = opts.label ?? null;
const text = `
SELECT *
FROM event_log e
WHERE
${filter[0]}
ORDER BY ${sortKey} ${sortDir};
($1::numeric IS NULL OR sequence = $1) AND
($2::numeric[] IS NULL OR sequence = ANY( $2 )) AND
($3::timestamptz IS NULL OR date(tstamp) = $3) AND
($3::timestamptz IS NULL OR
(($4::timestamptz IS NULL AND date(tstamp) = $3) OR
date(tstamp) BETWEEN SYMMETRIC $3 AND $4)) AND
($5::jsonpath IS NULL OR jsonb_path_exists(meta::jsonb, $5::jsonpath)) AND
($6::text IS NULL OR $6 = ANY(labels))
ORDER BY ${sortKey} ${sortDir}
LIMIT ${limit};
`;
const res = await client.query(text, filter[1]);
const values = [ sequence, sequences, date0, date1, jpq, label ];
const res = await client.query(text, values);
client.release();
return res.rows.map(i => replaceMarkers(i));
}
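// Usage sketch (illustrative): list(projectId, {label: "FGSP"}) returns all
// FGSP-labelled events; list(projectId, {sequences: "12;13", jpq: '$.auto'})
// returns events for sequences 12 and 13 whose meta has an `auto` key.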

View File

@@ -9,10 +9,10 @@ async function post (projectId, payload, opts = {}) {
const text = `
INSERT
INTO event_log (tstamp, sequence, point, remarks, labels)
VALUES ($1, $2, $3, replace_placeholders($4, $1, $2, $3), $5);
INTO event_log (tstamp, sequence, point, remarks, labels, meta)
VALUES ($1, $2, $3, replace_placeholders($4, $1, $2, $3), $5, $6);
`;
const values = [ p.tstamp, p.sequence, p.point, p.remarks, p.labels ];
const values = [ p.tstamp, p.sequence, p.point, p.remarks, p.labels, p.meta ];
DEBUG("Inserting new values: %O", values);
await client.query(text, values);

View File

@@ -3,5 +3,6 @@ module.exports = {
bbox: require('./bbox'),
preplot: require('./preplot'),
raw: require('./raw'),
final: require('./final')
final: require('./final'),
layer: require('./layer')
};

View File

@@ -0,0 +1,31 @@
const { setSurvey } = require('../../../connection');
async function get (projectId, layerName = null, options = {}) {
const client = await setSurvey(projectId);
const text = `
SELECT path, (data - 'type') data
FROM files f
INNER JOIN file_data
USING (hash)
WHERE data->>'type' = 'map_layer'
AND data->>'format' = 'geojson'
AND (data->>'name' = $1
OR $1 IS NULL);
`;
const values = [ layerName ];
const res = await client.query(text, values);
client.release();
if (res.rows && res.rows.length) {
return res.rows.map(row => ({...row.data, path: row.path}));
} else {
throw {status: 404, message: "Not found"};
}
}
module.exports = get;

View File

@@ -0,0 +1,4 @@
module.exports = {
get: require('./get')
};

View File

@@ -1,6 +1,6 @@
// FIXME This code is in painful need of refactoring
const { DEBUG } = require("DOUGAL_ROOT/debug")(__filename);
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
const { setSurvey, transaction, pool } = require('../connection');
let last_tstamp = 0;
@@ -8,14 +8,10 @@ let last_tstamp = 0;
async function getAllProjectConfigs () {
const client = await pool.connect();
const res0 = await client.query("SELECT schema FROM projects;");
const text = res0.rows.map(r => {
return `SELECT '${r.schema}' AS schema, data FROM ${r.schema}.file_data WHERE (data->>'archived')::boolean IS NOT true AND data->>'id' IS NOT NULL`;
}).join("\nUNION ALL ");
const res1 = await client.query(text);
const text = `SELECT schema, meta AS data FROM projects;`;
const res = await client.query(text);
client.release();
return res1.rows.map(r => Object.assign(r.data, {schema: r.schema}));
return res.rows;
}
async function getNearestPreplot (candidates) {
@@ -74,9 +70,9 @@ async function getNearestOfflinePreplot (candidates) {
if ("latitude" in candidates[0] && "longitude" in candidates[0]) {
text = `
SELECT
'${c._schema}' AS _schema,
'${c.schema}' AS schema,
ST_Distance(ST_Transform(ST_SetSRID(ST_MakePoint($1, $2), 4326), ST_SRID(geometry)), geometry) AS distance
FROM ${c._schema}.preplot_points
FROM ${c.schema}.preplot_points
ORDER BY distance ASC
LIMIT 1;
`;
@@ -84,9 +80,9 @@ async function getNearestOfflinePreplot (candidates) {
} else if ("easting" in candidates[0] && "northing" in candidates[0]) {
text = `
SELECT
'${c._schema}' AS _schema,
'${c.schema}' AS schema,
ST_Distance(ST_SetSRID(ST_MakePoint($1, $2), ST_SRID(geometry)), geometry) AS distance
FROM ${c._schema}.preplot_points
FROM ${c.schema}.preplot_points
ORDER BY distance ASC
LIMIT 1;
`;
@@ -102,13 +98,13 @@ async function getNearestOfflinePreplot (candidates) {
const results = [];
for (const qry of queries) {
const res = await client.query(qry.text, qry.values);
if (res.rows[0] && res.rows[0]._schema) {
if (res.rows[0] && res.rows[0].schema) {
results.push(res.rows[0]);
}
}
client.release();
const _schema = results.sort( (a, b) => a.distance - b.distance).shift()?._schema;
return candidates.find(c => c._schema == _schema);
const schema = results.sort( (a, b) => a.distance - b.distance).shift()?.schema;
return candidates.find(c => c.schema == schema);
}
async function saveOnline (dataset, opts = {}) {
@@ -141,14 +137,14 @@ async function saveOnline (dataset, opts = {}) {
await client.query(`
INSERT INTO raw_shots
(sequence, line, point, objref, tstamp, geometry, hash)
VALUES ($1, $2, $3, $4, $5, ST_SetSRID(ST_MakePoint($6, $7), (SELECT (data->>'epsg')::integer AS epsg FROM file_data WHERE data ? 'id')), '*online*')
VALUES ($1, $2, $3, $4, $5, ST_SetSRID(ST_MakePoint($6, $7), (select (project_configuration()->>'epsg')::integer as epsg)), '*online*')
ON CONFLICT DO NOTHING;
`, [item.sequence, item.line, item.point, 0, item.tstamp, item.easting, item.northing]);
} else if (item.latitude && item.longitude) {
await client.query(`
INSERT INTO raw_shots
(sequence, line, point, objref, tstamp, geometry, hash)
VALUES ($1, $2, $3, $4, $5, ST_Transform(ST_SetSRID(ST_MakePoint($6, $7), 4326), (SELECT (data->>'epsg')::integer AS epsg FROM file_data WHERE data ? 'id')), '*online*')
VALUES ($1, $2, $3, $4, $5, ST_Transform(ST_SetSRID(ST_MakePoint($6, $7), 4326), (select (project_configuration()->>'epsg')::integer as epsg)), '*online*')
ON CONFLICT DO NOTHING;
`, [item.sequence, item.line, item.point, 0, item.tstamp, item.longitude, item.latitude]);
} else {
@@ -158,8 +154,8 @@ async function saveOnline (dataset, opts = {}) {
}
await transaction.commit(client);
} catch (error) {
console.error("ONLINE DATA INSERT ERROR");
console.error(error);
ERROR("ONLINE DATA INSERT ERROR");
ERROR(error);
await transaction.rollback(client);
} finally {
client.release();
@@ -186,7 +182,7 @@ async function saveOffline (navData, opts = {}) {
} else if (schema && hasEastNorth) {
const text = `
INSERT INTO real_time_inputs (tstamp, geometry, meta)
VALUES ($1, ST_Transform(ST_SetSRID(ST_MakePoint($2, $3), (SELECT (data->>'epsg')::integer AS epsg FROM ${schema}.file_data)), 4326), $4);
VALUES ($1, ST_Transform(ST_SetSRID(ST_MakePoint($2, $3), (select (project_configuration()->>'epsg')::integer as epsg)), 4326), $4);
`;
const values = [navData.tstamp, navData.longitude, navData.latitude, navData.payload];
@@ -215,6 +211,37 @@ async function saveOffline (navData, opts = {}) {
client.release();
}
async function getCandidates (navData) {
const configs = await getAllProjectConfigs();
// We just get the bits of interest: pattern and schema
const candidates = configs.map(c => {
if (!c?.data?.online?.line || c?.archived === true) {
return null;
}
const p = c.data.online.line.pattern; // For short
const rx = new RegExp(p.regex, p.flags);
const matches = navData.lineName.match(rx);
if (!matches || ((matches.length+1) < p.captures.length)) {
return null;
}
matches.shift(); // Get rid of the full matched text
const obj = Object.assign({}, navData, {schema: c.schema});
p.captures.forEach( (k, i) => {
obj[k] = matches[i];
});
return obj;
}).filter(c => !!c);
DEBUG("Candidates: %j", candidates.map(c => c.schema));
return candidates;
}
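// Illustration (hypothetical pattern): with
//   pattern = { regex: "^([0-9]{4})([A-Z])([0-9]{3})$", flags: "", captures: ["line", "pass", "seq"] }
// a lineName of "1042A003" yields obj.line = "1042", obj.pass = "A", obj.seq = "003".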
async function save (navData, opts = {}) {
const hasLatLon = ("latitude" in navData && "longitude" in navData);
@@ -222,44 +249,21 @@ async function save (navData, opts = {}) {
const hasLinePoint = ("lineName" in navData && "point" in navData);
if (!(hasLinePoint || hasLatLon || hasEastNorth)) {
// This is of no interest to us
console.warning("Ignoring data without useful values", navData);
NOTICE("Ignoring data without useful values", navData);
return;
}
// DEBUG("navData", navData);
if (navData.online === true) {
// So we have a lineName, see which projects match the line pattern.
// For this we need to get all the project configs
const configs = await getAllProjectConfigs();
// We just get the bits of interest: pattern and schema
const candidates = configs.map(c => {
if (!(c && c.online && c.online.line)) {
return null;
}
const p = c.online.line.pattern; // For short
const rx = new RegExp(p.regex, p.flags);
const matches = navData.lineName.match(rx);
if (!matches || ((matches.length+1) < p.captures.length)) {
return null;
}
matches.shift(); // Get rid of the full matched text
const obj = Object.assign({}, navData, {schema: c.schema});
p.captures.forEach( (k, i) => {
obj[k] = matches[i];
});
return obj;
}).filter(c => !!c);
DEBUG("Candidates: %j", candidates);
// console.log("CANDIDATES", candidates);
const candidates = await getCandidates(navData);
if (candidates.length == 0) {
// This is probably a test line, so we treat it as offline
console.log("No match");
WARNING("No match");
} else {
if (candidates.length == 1) {
// Only one candidate, associate with it
@@ -275,7 +279,7 @@ async function save (navData, opts = {}) {
await saveOnline(candidates.filter(c => c.schema == destinationSchema), opts);
navData.payload._schema = destinationSchema;
} else {
console.log("Nowhere to save to");
WARNING("Nowhere to save to");
}
}
@@ -286,17 +290,18 @@ async function save (navData, opts = {}) {
}
} else {
// We are offline. We only assign _schema once every save_interval seconds at most
// unless there is gun data present.
if (opts.offline_survey_heuristics == "nearest_preplot") {
const now = Date.now();
const do_save = !opts.offline_survey_detect_interval ||
(now - last_tstamp) >= opts.offline_survey_detect_interval;
if (do_save) {
if (do_save || "guns" in navData?.payload) {
const configs = await getAllProjectConfigs();
const candidates = configs.map(c => Object.assign({}, navData, {_schema: c.schema}));
const candidates = await getCandidates(navData);
const bestCandidate = await getNearestOfflinePreplot(candidates);
if (bestCandidate) {
navData.payload._schema = bestCandidate._schema;
navData.payload._schema = bestCandidate.schema;
last_tstamp = now;
}
}

View File

@@ -1,5 +1,5 @@
const { setSurvey } = require('../../connection');
const { deepMerge } = require('../../../utils');
const { deepMerge, removeNulls } = require('../../../utils');
const { modify } = require('../create');
@@ -22,21 +22,24 @@ async function patch (projectId, payload, opts = {}) {
throw { status: 404, message: "Not found" };
}
if (("id" in payload) && (projectId != payload.id)) {
if (("id" in payload) && (projectId.toLowerCase() != payload.id.toLowerCase())) {
throw {
status: 422,
message: "Project ID cannot be changed in this Dougal version"
}
}
if (("name" in payload) && (source.name != payload.name)) {
if (("name" in payload) && source.name && (source.name != payload.name)) {
throw {
status: 422,
message: "Project name cannot be changed in this Dougal version"
}
}
const dest = deepMerge(source, payload);
// We do not allow users to change the schema
delete payload.schema;
const dest = removeNulls(deepMerge(source, payload));
await modify(projectId, dest);
return dest;

View File

@@ -7,10 +7,11 @@ const { INFO, DEBUG, WARNING, ERROR } = require('DOUGAL_ROOT/debug')(__filename)
function checkSyntax (value, type = "project") {
var requiredFields = {};
switch (type) {
case "project":
var requiredFields = {
requiredFields = {
id: "string",
name: "string",
epsg: "number",
@@ -18,7 +19,7 @@ function checkSyntax (value, type = "project") {
};
break;
case "binning":
var requiredFields = {
requiredFields = {
theta: "number",
I_inc: "number",
J_inc: "number",
@@ -28,23 +29,19 @@ function checkSyntax (value, type = "project") {
}
break
case "origin":
var requiredFields = {
requiredFields = {
easting: "number",
northing: "number",
I: "number",
J: "number"
}
break;
break;
default:
return typeof type == "function"
? type(value)
: typeof value == type;
}
// return Object.entries(requiredFields).every( ([field, test]) => {
// return value.hasOwnProperty(field) && checkSyntax(value[field], test);
// });
for (const [field, test] of Object.entries(requiredFields)) {
if (!value.hasOwnProperty(field)) {
return `Missing required property: ${field}`;

View File

@@ -1,14 +1,15 @@
const fs = require('fs');
const YAML = require('yaml');
const flattenQCDefinitions = require('../../../utils/flattenQCDefinitions');
const configuration = require('../../configuration'); // lib/db/configuration
const { translatePath } = require('../../../utils/logicalPath');
const project = require('../../project'); // lib/db/project
async function get (projectId, opts = {}) {
const qcConfig = await configuration.get(projectId, "qc");
const qcConfig = (await project.configuration.get(projectId))?.qc;
if (qcConfig?.definitions) {
try {
const definitions = YAML.parse(fs.readFileSync(qcConfig.definitions).toString());
const definitions = YAML.parse(fs.readFileSync(translatePath(qcConfig.definitions)).toString());
return opts.flat ? flattenQCDefinitions(definitions) : definitions;
} catch (err) {

View File

@@ -1,7 +1,7 @@
const fs = require('fs/promises');
const Path = require('path');
const mime = require('./mime-types');
const { translatePath, logicalRoot } = require('./logical');
const { translatePath, logicalRoot } = require('../utils/logicalPath');
const systemCfg = require('../config');
const projectCfg = require('../db/configuration');

View File

@@ -8,6 +8,7 @@ const { pool, setSurvey, transaction, fetchRow } = require('../db/connection')
const { project, sequence, configuration, info } = require('../db')
const flattenQCDefinitions = require('./flatten');
const { projectHash, sequenceHash } = require('./last-modified');
const { translatePath } = require('../utils/logicalPath');
const { runShotsQC, saveShotsQC } = require('./shots');
const { runSequenceQCs, saveSequenceQCs } = require('./sequences');
@@ -42,12 +43,12 @@ function forceQC (projectId, sequenceNumber) {
async function getProjectQCConfig (projectId) {
console.log("getProjectQCConfig");
const qcConfig = await configuration.get(projectId, "qc");
const qcConfig = (await project.configuration.get(projectId))?.qc;
console.log("qcConfig", qcConfig);
if (qcConfig?.definitions && qcConfig?.parameters) {
const definitions =
flattenQCDefinitions(YAML.parse(fs.readFileSync(qcConfig.definitions).toString()));
const parameters = YAML.parse(fs.readFileSync(qcConfig.parameters).toString());
flattenQCDefinitions(YAML.parse(fs.readFileSync(translatePath(qcConfig.definitions)).toString()));
const parameters = YAML.parse(fs.readFileSync(translatePath(qcConfig.parameters)).toString());
return { definitions, parameters };
}
@@ -56,14 +57,14 @@ async function getProjectQCConfig (projectId) {
async function main () {
// Fetch list of projects
console.log("GET PROJECTS");
const projects = await project.list();
console.log("PROJECTS", projects);
const projects = await project.get();
for (const proj of projects) {
const projectId = proj.pid;
for (const {pid} of projects) {
const projectId = pid;
console.log("PROJECT ID", projectId);
const proj = await project.configuration.get(projectId);
if (!project.archived) {
if (!proj.archived) {
const QCTstamp = new Date();
const currentQCHash = await projectHash(projectId);
@@ -75,7 +76,7 @@ async function main () {
console.log("currentQCHash != lastQCHash", projectId, currentQCHash, lastQCHash);
// Fetch definitions and parameters
const { definitions, parameters } = await getProjectQCConfig(projectId) ?? {};
const { definitions, parameters } = await getProjectQCConfig(projectId, proj.qc) ?? {};
if (definitions && parameters) {
console.log("PROJECT ID", projectId);

View File

@@ -4,5 +4,9 @@ module.exports = {
dms: require('./dms'),
replaceMarkers: require('./replaceMarkers'),
flattenQCDefinitions: require('./flattenQCDefinitions'),
deepMerge: require('./deepMerge')
deepMerge: require('./deepMerge'),
removeNulls: require('./removeNulls'),
logicalPath: require('./logicalPath'),
ranges: require('./ranges'),
unique: require('./unique')
};

View File

@@ -10,6 +10,7 @@ function translatePath (file) {
return physicalPath;
} else {
// An attempt to break out of the logical path?
console.warn("Attempting to break out of the logical path?", physicalPath, prefix);
throw {
status: 404,
message: "Not found"

View File

@@ -0,0 +1,74 @@
function parseRange (str) {
const rx = /^[\[(].*,.*[)\]]$/;
if (rx.test(str)) {
const lower_inclusive = str[0] == '[';
const upper_inclusive = str[str.length-1] == ']';
const [ lower, upper ] = str.slice(1,-1).split(",");
return {
upper,
lower,
upper_inclusive,
lower_inclusive
};
}
}
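// e.g. parseRange("[1,10)") → { lower: "1", upper: "10", lower_inclusive: true, upper_inclusive: false }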
function parseValidity (str) {
const range = parseRange(str);
if (range) {
const ts0 = range.lower ? new Date(range.lower) : null;
const ts1 = range.upper ? new Date(range.upper) : null;
return {
...range,
lower: ts0,
upper: ts1
};
}
}
function withinValidity (range, ts) {
if (!ts) {
ts = new Date();
}
if (typeof range === "string") {
range = parseValidity(range);
}
if (range.lower) {
if (range.lower_inclusive) {
if (!(range.lower <= ts)) {
return false;
}
} else {
if (!(range.lower < ts)) {
return false;
}
}
}
if (range.upper) {
if (range.upper_inclusive) {
if (!(range.upper >= ts)) {
return false;
}
} else {
if (!(range.upper > ts)) {
return false;
}
}
}
return true;
}
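// e.g. withinValidity("[2023-01-01,)", new Date("2023-06-01")) → true (no upper bound)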
module.exports = {
parseRange,
parseValidity,
withinValidity
}

View File

@@ -0,0 +1,23 @@
/**
* Delete keys whose value is null.
*
*/
function removeNulls (obj) {
function getType (obj) {
return Object.prototype.toString.call(obj).slice(8, -1).toLowerCase();
}
for (let [key, value] of Object.entries(obj)) {
if (value === null) {
delete obj[key];
} else if (getType(value) == "object") {
removeNulls(value);
}
}
return obj;
}
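// e.g. removeNulls({a: 1, b: null, c: {d: null}}) → { a: 1, c: {} }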
module.exports = removeNulls;

View File

@@ -0,0 +1,6 @@
function unique(array) {
return [...new Set(array)];
}
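// e.g. unique([3, 1, 3, 2]) → [3, 1, 2] (first-occurrence order preserved)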
module.exports = unique;

View File

@@ -15,11 +15,12 @@
"dependencies": {
"body-parser": "gitlab:aaltronav/contrib/expressjs/body-parser",
"cookie-parser": "^1.4.5",
"csv": "^6.3.3",
"debug": "^4.3.4",
"express": "^4.17.1",
"express-jwt": "^6.0.0",
"express-jwt": "^8.4.1",
"json2csv": "^5.0.6",
"jsonwebtoken": "^8.5.1",
"jsonwebtoken": "^9.0.2",
"leaflet-headless": "git+https://git@gitlab.com/aaltronav/contrib/leaflet-headless.git#devel",
"marked": "^4.0.12",
"netmask": "^2.0.2",
@@ -36,7 +37,7 @@
"redoc-cli": "^0.13.9"
},
"engines": {
"node": ">=14.0.0"
"node": ">=18.0.0"
}
},
"node_modules/@mapbox/node-pre-gyp": {
@@ -58,20 +59,6 @@
"node-pre-gyp": "bin/node-pre-gyp"
}
},
"node_modules/@mapbox/node-pre-gyp/node_modules/semver": {
"version": "7.5.1",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.5.1.tgz",
"integrity": "sha512-Wvss5ivl8TMRZXXESstBA4uR5iXgEN/VC5/sOcuXdVLzcdkz4HWetIoRfG5gb5X+ij/G9rw9YoGn3QoQ8OCSpw==",
"dependencies": {
"lru-cache": "^6.0.0"
},
"bin": {
"semver": "bin/semver.js"
},
"engines": {
"node": ">=10"
}
},
"node_modules/@tootallnate/once": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/@tootallnate/once/-/once-1.1.2.tgz",
@@ -80,6 +67,19 @@
"node": ">= 6"
}
},
"node_modules/@types/jsonwebtoken": {
"version": "9.0.2",
"resolved": "https://registry.npmjs.org/@types/jsonwebtoken/-/jsonwebtoken-9.0.2.tgz",
"integrity": "sha512-drE6uz7QBKq1fYqqoFKTDRdFCPHd5TCub75BM+D+cMx7NU9hUz7SESLfC2fSCXVFMO5Yj8sOWHuGqPgjc+fz0Q==",
"dependencies": {
"@types/node": "*"
}
},
"node_modules/@types/node": {
"version": "20.6.0",
"resolved": "https://registry.npmjs.org/@types/node/-/node-20.6.0.tgz",
"integrity": "sha512-najjVq5KN2vsH2U/xyh2opaSEz6cZMR2SetLIlxlj08nOcmPOemJmUK2o4kUzfLqfrWE0PIrNeE16XhYDd3nqg=="
},
"node_modules/a-sync-waterfall": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/a-sync-waterfall/-/a-sync-waterfall-1.0.1.tgz",
@@ -221,11 +221,6 @@
"node": ">=0.8"
}
},
"node_modules/async": {
"version": "1.5.2",
"resolved": "https://registry.npmjs.org/async/-/async-1.5.2.tgz",
"integrity": "sha1-7GphrlZIDAw8skHJVhjiCJL5Zyo="
},
"node_modules/asynckit": {
"version": "0.4.0",
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
@@ -390,7 +385,7 @@
"node_modules/buffer-equal-constant-time": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/buffer-equal-constant-time/-/buffer-equal-constant-time-1.0.1.tgz",
"integrity": "sha1-+OcRMvf/5uAaXJaXpMbz5I1cyBk="
"integrity": "sha512-zRpUiDwd/xk6ADqPMATG8vc9VPrkck7T07OIx0gnjmJAnHnTVXNQG3vfvWNuiZIkwu9KrKdA1iJKfsfTVxE6NA=="
},
"node_modules/buffer-writer": {
"version": "2.0.0",
@@ -581,6 +576,35 @@
"resolved": "https://registry.npmjs.org/cssom/-/cssom-0.3.8.tgz",
"integrity": "sha512-b0tGHbfegbhPJpxpiBPU2sCkigAqtM9O121le6bbOlgyV+NyGyCmVfJ6QW9eRjz8CpNfWEOYBIMIGRYkLwsIYg=="
},
"node_modules/csv": {
"version": "6.3.3",
"resolved": "https://registry.npmjs.org/csv/-/csv-6.3.3.tgz",
"integrity": "sha512-TuOM1iZgdDiB6IuwJA8oqeu7g61d9CU9EQJGzCJ1AE03amPSh/UK5BMjAVx+qZUBb/1XEo133WHzWSwifa6Yqw==",
"dependencies": {
"csv-generate": "^4.2.8",
"csv-parse": "^5.5.0",
"csv-stringify": "^6.4.2",
"stream-transform": "^3.2.8"
},
"engines": {
"node": ">= 0.1.90"
}
},
"node_modules/csv-generate": {
"version": "4.2.8",
"resolved": "https://registry.npmjs.org/csv-generate/-/csv-generate-4.2.8.tgz",
"integrity": "sha512-qQ5CUs4I58kfo90EDBKjdp0SpJ3xWnN1Xk1lZ1ITvfvMtNRf+jrEP8tNPeEPiI9xJJ6Bd/km/1hMjyYlTpY42g=="
},
"node_modules/csv-parse": {
"version": "5.5.0",
"resolved": "https://registry.npmjs.org/csv-parse/-/csv-parse-5.5.0.tgz",
"integrity": "sha512-RxruSK3M4XgzcD7Trm2wEN+SJ26ChIb903+IWxNOcB5q4jT2Cs+hFr6QP39J05EohshRFEvyzEBoZ/466S2sbw=="
},
"node_modules/csv-stringify": {
"version": "6.4.2",
"resolved": "https://registry.npmjs.org/csv-stringify/-/csv-stringify-6.4.2.tgz",
"integrity": "sha512-DXIdnnCUQYjDKTu6TgCSzRDiAuLxDjhl4ErFP9FGMF3wzBGOVMg9bZTLaUcYtuvhXgNbeXPKeaRfpgyqE4xySw=="
},
"node_modules/d3-queue": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/d3-queue/-/d3-queue-2.0.3.tgz",
@@ -836,23 +860,22 @@
}
},
"node_modules/express-jwt": {
"version": "6.1.1",
"resolved": "https://registry.npmjs.org/express-jwt/-/express-jwt-6.1.1.tgz",
"integrity": "sha512-m8gkY04v5jtiFZn6bYQINYX/DVXq1DVb5nIW7H8l87qJ4BBvtQKFRpxyRE31odct7OPfHdT+B8678zJHhlMrpw==",
"version": "8.4.1",
"resolved": "https://registry.npmjs.org/express-jwt/-/express-jwt-8.4.1.tgz",
"integrity": "sha512-IZoZiDv2yZJAb3QrbaSATVtTCYT11OcqgFGoTN4iKVyN6NBkBkhtVIixww5fmakF0Upt5HfOxJuS6ZmJVeOtTQ==",
"dependencies": {
"async": "^1.5.0",
"express-unless": "^1.0.0",
"jsonwebtoken": "^8.1.0",
"lodash": "^4.17.21"
"@types/jsonwebtoken": "^9",
"express-unless": "^2.1.3",
"jsonwebtoken": "^9.0.0"
},
"engines": {
"node": ">= 8.0.0"
}
},
"node_modules/express-unless": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/express-unless/-/express-unless-1.0.0.tgz",
"integrity": "sha512-zXSSClWBPfcSYjg0hcQNompkFN/MxQQ53eyrzm9BYgik2ut2I7PxAf2foVqBRMYCwWaZx/aWodi+uk76npdSAw=="
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/express-unless/-/express-unless-2.1.3.tgz",
"integrity": "sha512-wj4tLMyCVYuIIKHGt0FhCtIViBcwzWejX0EjNxveAa6dG+0XBCQhMbx+PnkLkFCxLC69qoFrxds4pIyL88inaQ=="
},
"node_modules/express/node_modules/body-parser": {
"version": "1.19.2",
@@ -1376,9 +1399,9 @@
]
},
"node_modules/jsonwebtoken": {
"version": "8.5.1",
"resolved": "https://registry.npmjs.org/jsonwebtoken/-/jsonwebtoken-8.5.1.tgz",
"integrity": "sha512-XjwVfRS6jTMsqYs0EsuJ4LGxXV14zQybNd4L2r0UvbVnSF9Af8x7p5MzbJ90Ioz/9TI41/hTCvznF/loiSzn8w==",
"version": "9.0.2",
"resolved": "https://registry.npmjs.org/jsonwebtoken/-/jsonwebtoken-9.0.2.tgz",
"integrity": "sha512-PRp66vJ865SSqOlgqS8hujT5U4AOgMfhrwYIuIhfKaoSCZcirrmASQr8CX7cUg+RMih+hgznrjp99o+W4pJLHQ==",
"dependencies": {
"jws": "^3.2.2",
"lodash.includes": "^4.3.0",
@@ -1389,11 +1412,11 @@
"lodash.isstring": "^4.0.1",
"lodash.once": "^4.0.0",
"ms": "^2.1.1",
"semver": "^5.6.0"
"semver": "^7.5.4"
},
"engines": {
"node": ">=4",
"npm": ">=1.4.28"
"node": ">=12",
"npm": ">=6"
}
},
"node_modules/jsprim": {
@@ -1411,14 +1434,14 @@
}
},
"node_modules/jszip": {
"version": "3.7.1",
"resolved": "https://registry.npmjs.org/jszip/-/jszip-3.7.1.tgz",
"integrity": "sha512-ghL0tz1XG9ZEmRMcEN2vt7xabrDdqHHeykgARpmZ0BiIctWxM47Vt63ZO2dnp4QYt/xJVLLy5Zv1l/xRdh2byg==",
"version": "3.10.1",
"resolved": "https://registry.npmjs.org/jszip/-/jszip-3.10.1.tgz",
"integrity": "sha512-xXDvecyTpGLrqFrvkrUSoxxfJI5AH7U8zxxtVclpsUtMCq4JQ290LY8AW5c7Ggnr/Y/oK+bQMbqK2qmtk3pN4g==",
"dependencies": {
"lie": "~3.3.0",
"pako": "~1.0.2",
"readable-stream": "~2.3.6",
"set-immediate-shim": "~1.0.1"
"setimmediate": "^1.0.5"
}
},
"node_modules/jwa": {
@@ -1498,37 +1521,37 @@
"node_modules/lodash.includes": {
"version": "4.3.0",
"resolved": "https://registry.npmjs.org/lodash.includes/-/lodash.includes-4.3.0.tgz",
"integrity": "sha1-YLuYqHy5I8aMoeUTJUgzFISfVT8="
"integrity": "sha512-W3Bx6mdkRTGtlJISOvVD/lbqjTlPPUDTMnlXZFnVwi9NKJ6tiAk6LVdlhZMm17VZisqhKcgzpO5Wz91PCt5b0w=="
},
"node_modules/lodash.isboolean": {
"version": "3.0.3",
"resolved": "https://registry.npmjs.org/lodash.isboolean/-/lodash.isboolean-3.0.3.tgz",
"integrity": "sha1-bC4XHbKiV82WgC/UOwGyDV9YcPY="
"integrity": "sha512-Bz5mupy2SVbPHURB98VAcw+aHh4vRV5IPNhILUCsOzRmsTmSQ17jIuqopAentWoehktxGd9e/hbIXq980/1QJg=="
},
"node_modules/lodash.isinteger": {
"version": "4.0.4",
"resolved": "https://registry.npmjs.org/lodash.isinteger/-/lodash.isinteger-4.0.4.tgz",
"integrity": "sha1-YZwK89A/iwTDH1iChAt3sRzWg0M="
"integrity": "sha512-DBwtEWN2caHQ9/imiNeEA5ys1JoRtRfY3d7V9wkqtbycnAmTvRRmbHKDV4a0EYc678/dia0jrte4tjYwVBaZUA=="
},
"node_modules/lodash.isnumber": {
"version": "3.0.3",
"resolved": "https://registry.npmjs.org/lodash.isnumber/-/lodash.isnumber-3.0.3.tgz",
"integrity": "sha1-POdoEMWSjQM1IwGsKHMX8RwLH/w="
"integrity": "sha512-QYqzpfwO3/CWf3XP+Z+tkQsfaLL/EnUlXWVkIk5FUPc4sBdTehEqZONuyRt2P67PXAk+NXmTBcc97zw9t1FQrw=="
},
"node_modules/lodash.isplainobject": {
"version": "4.0.6",
"resolved": "https://registry.npmjs.org/lodash.isplainobject/-/lodash.isplainobject-4.0.6.tgz",
"integrity": "sha1-fFJqUtibRcRcxpC4gWO+BJf1UMs="
"integrity": "sha512-oSXzaWypCMHkPC3NvBEaPHf0KsA5mvPrOPgQWDsbg8n7orZ290M0BmC/jgRZ4vcJ6DTAhjrsSYgdsW/F+MFOBA=="
},
"node_modules/lodash.isstring": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/lodash.isstring/-/lodash.isstring-4.0.1.tgz",
"integrity": "sha1-1SfftUVuynzJu5XV2ur4i6VKVFE="
"integrity": "sha512-0wJxfxH1wgO3GrbuP+dTTk7op+6L41QCXbGINEmD+ny/G/eCqGzxyCsh7159S+mgDDcoarnBw6PC1PS5+wUGgw=="
},
"node_modules/lodash.once": {
"version": "4.1.1",
"resolved": "https://registry.npmjs.org/lodash.once/-/lodash.once-4.1.1.tgz",
"integrity": "sha1-DdOXEhPHxW34gJd9UEyI+0cal6w="
"integrity": "sha512-Sb487aTOCr9drQVL8pIxOzVhafOjZN9UU54hiN8PU3uAiSV7lx1yYNpbNmex2PK6dSJoNTSJUUswT651yww3Mg=="
},
"node_modules/lru-cache": {
"version": "6.0.0",
@@ -1556,9 +1579,9 @@
}
},
"node_modules/make-dir/node_modules/semver": {
"version": "6.3.0",
"resolved": "https://registry.npmjs.org/semver/-/semver-6.3.0.tgz",
"integrity": "sha512-b39TBaTSfV6yBrapU89p5fKekE2m/NwnDocOVruQFS1/veMgdzuPcnOM34M6CwxW8jH/lxEa5rBoDeUwu5HHTw==",
"version": "6.3.1",
"resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
"integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==",
"bin": {
"semver": "bin/semver.js"
}
@@ -1637,9 +1660,9 @@
}
},
"node_modules/minimatch": {
"version": "3.0.4",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.0.4.tgz",
"integrity": "sha512-yJHVQEhyqPLUTgt9B83PXu6W3rx4MvvHvSUvToogpwoGDOUQ+yDrR0HRot+yOCdCO7u4hX3pWft6kWBBcqh0UA==",
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
"integrity": "sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==",
"dependencies": {
"brace-expansion": "^1.1.7"
},
@@ -1779,9 +1802,9 @@
}
},
"node_modules/nunjucks": {
"version": "3.2.3",
"resolved": "https://registry.npmjs.org/nunjucks/-/nunjucks-3.2.3.tgz",
"integrity": "sha512-psb6xjLj47+fE76JdZwskvwG4MYsQKXUtMsPh6U0YMvmyjRtKRFcxnlXGWglNybtNTNVmGdp94K62/+NjF5FDQ==",
"version": "3.2.4",
"resolved": "https://registry.npmjs.org/nunjucks/-/nunjucks-3.2.4.tgz",
"integrity": "sha512-26XRV6BhkgK0VOxfbU5cQI+ICFUtMLixv1noZn1tGU38kQH5A5nmmbk/O45xdyBhD1esk47nKrY0mvQpZIhRjQ==",
"dependencies": {
"a-sync-waterfall": "^1.0.0",
"asap": "^2.0.3",
@@ -1792,6 +1815,14 @@
},
"engines": {
"node": ">= 6.9.0"
},
"peerDependencies": {
"chokidar": "^3.3.0"
},
"peerDependenciesMeta": {
"chokidar": {
"optional": true
}
}
},
"node_modules/nwsapi": {
@@ -2078,6 +2109,11 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/querystringify": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/querystringify/-/querystringify-2.2.0.tgz",
"integrity": "sha512-FIqgj2EUvTa7R50u0rGsyTftzjYmv/a3hO345bZNrqabNqjtgiDMgmo4mkUjd+nzU5oF3dClKqFIPUKybUyqoQ=="
},
"node_modules/range-parser": {
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/range-parser/-/range-parser-1.2.1.tgz",
@@ -5458,9 +5494,9 @@
}
},
"node_modules/request/node_modules/qs": {
"version": "6.5.2",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.5.2.tgz",
"integrity": "sha512-N5ZAX4/LxJmF+7wN74pUD6qAh9/wnvdQcjq9TZjevvXzSUo7bfmw91saqMjzGS2xq91/odN2dW/WOl7qQHNDGA==",
"version": "6.5.3",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.5.3.tgz",
"integrity": "sha512-qxXIEh4pCGfHICj1mAJQ2/2XVZkjCDTcEgfoSQxc/fYivUZxTkk7L3bDBJSoNrEzXI17oUO5Dp07ktqE5KzczA==",
"engines": {
"node": ">=0.6"
}
@@ -5477,6 +5513,11 @@
"node": ">=0.8"
}
},
"node_modules/requires-port": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/requires-port/-/requires-port-1.0.0.tgz",
"integrity": "sha512-KigOCHcocU3XODJxsu8i/j8T9tzT4adHiecwORRQ0ZZFcp7ahwXuRU1m+yuO90C5ZUyGeGfocHDI14M3L3yDAQ=="
},
"node_modules/rimraf": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/rimraf/-/rimraf-3.0.2.tgz",
@@ -5526,11 +5567,17 @@
}
},
"node_modules/semver": {
"version": "5.7.1",
"resolved": "https://registry.npmjs.org/semver/-/semver-5.7.1.tgz",
"integrity": "sha512-sauaDf/PZdVgrLTNYHRtpXa1iRiKcaebiKQ1BJdpQlWH2lCvexQdX55snPFyK7QzpudqbCI0qXFfOasHdyNDGQ==",
"version": "7.5.4",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.5.4.tgz",
"integrity": "sha512-1bCSESV6Pv+i21Hvpxp3Dx+pSD8lIPt8uVjRrxAUt/nbswYc+tK6Y2btiULjd4+fnq15PX+nqQDC7Oft7WkwcA==",
"dependencies": {
"lru-cache": "^6.0.0"
},
"bin": {
"semver": "bin/semver"
"semver": "bin/semver.js"
},
"engines": {
"node": ">=10"
}
},
"node_modules/send": {
@@ -5593,13 +5640,10 @@
"resolved": "https://registry.npmjs.org/set-blocking/-/set-blocking-2.0.0.tgz",
"integrity": "sha512-KiKBS8AnWGEyLzofFfmvKwpdPzqiy16LvQfK3yv/fVH7Bj13/wl3JSR1J+rfgRE9q7xUJK4qvgS8raSOeLUehw=="
},
"node_modules/set-immediate-shim": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/set-immediate-shim/-/set-immediate-shim-1.0.1.tgz",
"integrity": "sha1-SysbJ+uAip+NzEgaWOXlb1mfP2E=",
"engines": {
"node": ">=0.10.0"
}
"node_modules/setimmediate": {
"version": "1.0.5",
"resolved": "https://registry.npmjs.org/setimmediate/-/setimmediate-1.0.5.tgz",
"integrity": "sha512-MATJdZp8sLqDl/68LfQmbP8zKPLQNV6BIZoIgrscFDQ+RsvK/BxeDQOgyxKKoh0y/8h3BqVFnCqQ/gd+reiIXA=="
},
"node_modules/setprototypeof": {
"version": "1.2.0",
@@ -5699,6 +5743,11 @@
"node": ">= 0.6"
}
},
"node_modules/stream-transform": {
"version": "3.2.8",
"resolved": "https://registry.npmjs.org/stream-transform/-/stream-transform-3.2.8.tgz",
"integrity": "sha512-NUx0mBuI63KbNEEh9Yj0OzKB7iMOSTpkuODM2G7By+TTVihEIJ0cYp5X+pq/TdJRlsznt6CYR8HqxexyC6/bTw=="
},
"node_modules/string_decoder": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz",
@@ -5796,13 +5845,14 @@
}
},
"node_modules/tough-cookie": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/tough-cookie/-/tough-cookie-4.0.0.tgz",
"integrity": "sha512-tHdtEpQCMrc1YLrMaqXXcj6AxhYi/xgit6mZu1+EDWUn+qhUf8wMQoFIy9NXuq23zAwtcB0t/MjACGR18pcRbg==",
"version": "4.1.3",
"resolved": "https://registry.npmjs.org/tough-cookie/-/tough-cookie-4.1.3.tgz",
"integrity": "sha512-aX/y5pVRkfRnfmuX+OdbSdXvPe6ieKX/G2s7e98f4poJHnqH3281gDPm/metm6E/WRamfx7WC4HUqkWHfQHprw==",
"dependencies": {
"psl": "^1.1.33",
"punycode": "^2.1.1",
"universalify": "^0.1.2"
"universalify": "^0.2.0",
"url-parse": "^1.5.3"
},
"engines": {
"node": ">=6"
@@ -5880,9 +5930,9 @@
"integrity": "sha512-Tfay0l6gJMP5rkil8CzGbLthukn+9BN/VXWcABVFPjOoelJ+koW8BuPZYk+h/L+lEeIp1fSzVRiWRPIjKVjPdg=="
},
"node_modules/universalify": {
"version": "0.1.2",
"resolved": "https://registry.npmjs.org/universalify/-/universalify-0.1.2.tgz",
"integrity": "sha512-rBJeI5CXAlmy1pV+617WB9J63U6XcazHHF2f2dbJix4XzpUF0RS3Zbj0FGIOCAva5P/d/GBOYaACQ1w+0azUkg==",
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/universalify/-/universalify-0.2.0.tgz",
"integrity": "sha512-CJ1QgKmNg3CwvAv/kOFmtnEN05f0D/cn9QntgNOQlQF9dgvVTHj3t+8JPdjqawCHk7V/KA+fbUqzZ9XWhcqPUg==",
"engines": {
"node": ">= 4.0.0"
}
@@ -5911,6 +5961,15 @@
"node": ">=6"
}
},
"node_modules/url-parse": {
"version": "1.5.10",
"resolved": "https://registry.npmjs.org/url-parse/-/url-parse-1.5.10.tgz",
"integrity": "sha512-WypcfiRhfeUP9vvF0j6rw0J3hrWrw6iZv3+22h6iRMJ/8z1Tj6XfLP4DsUix5MhMPnXpiHDoKyoZ/bdCkwBCiQ==",
"dependencies": {
"querystringify": "^2.1.1",
"requires-port": "^1.0.0"
}
},
"node_modules/util-deprecate": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz",
@@ -6015,9 +6074,9 @@
}
},
"node_modules/word-wrap": {
"version": "1.2.3",
"resolved": "https://registry.npmjs.org/word-wrap/-/word-wrap-1.2.3.tgz",
"integrity": "sha512-Hz/mrNwitNRh/HUAtM/VT/5VH+ygD6DV7mYKZAtHOrbs8U7lvPS6xf7EJKMF0uW1KJCl0H701g3ZGus+muE5vQ==",
"version": "1.2.5",
"resolved": "https://registry.npmjs.org/word-wrap/-/word-wrap-1.2.5.tgz",
"integrity": "sha512-BN22B5eaMMI9UMtjrGd5g5eCYPpCPDUy0FJXbYsaT5zYxjFOckS53SQDE3pWkVoWpHXVb3BrYcEN4Twa55B5cA==",
"engines": {
"node": ">=0.10.0"
}
@@ -6094,16 +6153,6 @@
"rimraf": "^3.0.2",
"semver": "^7.3.5",
"tar": "^6.1.11"
},
"dependencies": {
"semver": {
"version": "7.5.1",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.5.1.tgz",
"integrity": "sha512-Wvss5ivl8TMRZXXESstBA4uR5iXgEN/VC5/sOcuXdVLzcdkz4HWetIoRfG5gb5X+ij/G9rw9YoGn3QoQ8OCSpw==",
"requires": {
"lru-cache": "^6.0.0"
}
}
}
},
"@tootallnate/once": {
@@ -6111,6 +6160,19 @@
"resolved": "https://registry.npmjs.org/@tootallnate/once/-/once-1.1.2.tgz",
"integrity": "sha512-RbzJvlNzmRq5c3O09UipeuXno4tA1FE6ikOjxZK0tuxVv3412l64l5t1W5pj4+rJq9vpkm/kwiR07aZXnsKPxw=="
},
"@types/jsonwebtoken": {
"version": "9.0.2",
"resolved": "https://registry.npmjs.org/@types/jsonwebtoken/-/jsonwebtoken-9.0.2.tgz",
"integrity": "sha512-drE6uz7QBKq1fYqqoFKTDRdFCPHd5TCub75BM+D+cMx7NU9hUz7SESLfC2fSCXVFMO5Yj8sOWHuGqPgjc+fz0Q==",
"requires": {
"@types/node": "*"
}
},
"@types/node": {
"version": "20.6.0",
"resolved": "https://registry.npmjs.org/@types/node/-/node-20.6.0.tgz",
"integrity": "sha512-najjVq5KN2vsH2U/xyh2opaSEz6cZMR2SetLIlxlj08nOcmPOemJmUK2o4kUzfLqfrWE0PIrNeE16XhYDd3nqg=="
},
"a-sync-waterfall": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/a-sync-waterfall/-/a-sync-waterfall-1.0.1.tgz",
@@ -6223,11 +6285,6 @@
"resolved": "https://registry.npmjs.org/assert-plus/-/assert-plus-1.0.0.tgz",
"integrity": "sha1-8S4PPF13sLHN2RRpQuTpbB5N1SU="
},
"async": {
"version": "1.5.2",
"resolved": "https://registry.npmjs.org/async/-/async-1.5.2.tgz",
"integrity": "sha1-7GphrlZIDAw8skHJVhjiCJL5Zyo="
},
"asynckit": {
"version": "0.4.0",
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
@@ -6361,7 +6418,7 @@
"buffer-equal-constant-time": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/buffer-equal-constant-time/-/buffer-equal-constant-time-1.0.1.tgz",
"integrity": "sha1-+OcRMvf/5uAaXJaXpMbz5I1cyBk="
"integrity": "sha512-zRpUiDwd/xk6ADqPMATG8vc9VPrkck7T07OIx0gnjmJAnHnTVXNQG3vfvWNuiZIkwu9KrKdA1iJKfsfTVxE6NA=="
},
"buffer-writer": {
"version": "2.0.0",
@@ -6501,6 +6558,32 @@
}
}
},
"csv": {
"version": "6.3.3",
"resolved": "https://registry.npmjs.org/csv/-/csv-6.3.3.tgz",
"integrity": "sha512-TuOM1iZgdDiB6IuwJA8oqeu7g61d9CU9EQJGzCJ1AE03amPSh/UK5BMjAVx+qZUBb/1XEo133WHzWSwifa6Yqw==",
"requires": {
"csv-generate": "^4.2.8",
"csv-parse": "^5.5.0",
"csv-stringify": "^6.4.2",
"stream-transform": "^3.2.8"
}
},
"csv-generate": {
"version": "4.2.8",
"resolved": "https://registry.npmjs.org/csv-generate/-/csv-generate-4.2.8.tgz",
"integrity": "sha512-qQ5CUs4I58kfo90EDBKjdp0SpJ3xWnN1Xk1lZ1ITvfvMtNRf+jrEP8tNPeEPiI9xJJ6Bd/km/1hMjyYlTpY42g=="
},
"csv-parse": {
"version": "5.5.0",
"resolved": "https://registry.npmjs.org/csv-parse/-/csv-parse-5.5.0.tgz",
"integrity": "sha512-RxruSK3M4XgzcD7Trm2wEN+SJ26ChIb903+IWxNOcB5q4jT2Cs+hFr6QP39J05EohshRFEvyzEBoZ/466S2sbw=="
},
"csv-stringify": {
"version": "6.4.2",
"resolved": "https://registry.npmjs.org/csv-stringify/-/csv-stringify-6.4.2.tgz",
"integrity": "sha512-DXIdnnCUQYjDKTu6TgCSzRDiAuLxDjhl4ErFP9FGMF3wzBGOVMg9bZTLaUcYtuvhXgNbeXPKeaRfpgyqE4xySw=="
},
"d3-queue": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/d3-queue/-/d3-queue-2.0.3.tgz",
@@ -6734,20 +6817,19 @@
}
},
"express-jwt": {
"version": "6.1.1",
"resolved": "https://registry.npmjs.org/express-jwt/-/express-jwt-6.1.1.tgz",
"integrity": "sha512-m8gkY04v5jtiFZn6bYQINYX/DVXq1DVb5nIW7H8l87qJ4BBvtQKFRpxyRE31odct7OPfHdT+B8678zJHhlMrpw==",
"version": "8.4.1",
"resolved": "https://registry.npmjs.org/express-jwt/-/express-jwt-8.4.1.tgz",
"integrity": "sha512-IZoZiDv2yZJAb3QrbaSATVtTCYT11OcqgFGoTN4iKVyN6NBkBkhtVIixww5fmakF0Upt5HfOxJuS6ZmJVeOtTQ==",
"requires": {
"async": "^1.5.0",
"express-unless": "^1.0.0",
"jsonwebtoken": "^8.1.0",
"lodash": "^4.17.21"
"@types/jsonwebtoken": "^9",
"express-unless": "^2.1.3",
"jsonwebtoken": "^9.0.0"
}
},
"express-unless": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/express-unless/-/express-unless-1.0.0.tgz",
"integrity": "sha512-zXSSClWBPfcSYjg0hcQNompkFN/MxQQ53eyrzm9BYgik2ut2I7PxAf2foVqBRMYCwWaZx/aWodi+uk76npdSAw=="
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/express-unless/-/express-unless-2.1.3.tgz",
"integrity": "sha512-wj4tLMyCVYuIIKHGt0FhCtIViBcwzWejX0EjNxveAa6dG+0XBCQhMbx+PnkLkFCxLC69qoFrxds4pIyL88inaQ=="
},
"extend": {
"version": "3.0.2",
@@ -7132,9 +7214,9 @@
"integrity": "sha1-P02uSpH6wxX3EGL4UhzCOfE2YoA="
},
"jsonwebtoken": {
"version": "8.5.1",
"resolved": "https://registry.npmjs.org/jsonwebtoken/-/jsonwebtoken-8.5.1.tgz",
"integrity": "sha512-XjwVfRS6jTMsqYs0EsuJ4LGxXV14zQybNd4L2r0UvbVnSF9Af8x7p5MzbJ90Ioz/9TI41/hTCvznF/loiSzn8w==",
"version": "9.0.2",
"resolved": "https://registry.npmjs.org/jsonwebtoken/-/jsonwebtoken-9.0.2.tgz",
"integrity": "sha512-PRp66vJ865SSqOlgqS8hujT5U4AOgMfhrwYIuIhfKaoSCZcirrmASQr8CX7cUg+RMih+hgznrjp99o+W4pJLHQ==",
"requires": {
"jws": "^3.2.2",
"lodash.includes": "^4.3.0",
@@ -7145,7 +7227,7 @@
"lodash.isstring": "^4.0.1",
"lodash.once": "^4.0.0",
"ms": "^2.1.1",
"semver": "^5.6.0"
"semver": "^7.5.4"
}
},
"jsprim": {
@@ -7160,14 +7242,14 @@
}
},
"jszip": {
"version": "3.7.1",
"resolved": "https://registry.npmjs.org/jszip/-/jszip-3.7.1.tgz",
"integrity": "sha512-ghL0tz1XG9ZEmRMcEN2vt7xabrDdqHHeykgARpmZ0BiIctWxM47Vt63ZO2dnp4QYt/xJVLLy5Zv1l/xRdh2byg==",
"version": "3.10.1",
"resolved": "https://registry.npmjs.org/jszip/-/jszip-3.10.1.tgz",
"integrity": "sha512-xXDvecyTpGLrqFrvkrUSoxxfJI5AH7U8zxxtVclpsUtMCq4JQ290LY8AW5c7Ggnr/Y/oK+bQMbqK2qmtk3pN4g==",
"requires": {
"lie": "~3.3.0",
"pako": "~1.0.2",
"readable-stream": "~2.3.6",
"set-immediate-shim": "~1.0.1"
"setimmediate": "^1.0.5"
}
},
"jwa": {
@@ -7243,37 +7325,37 @@
"lodash.includes": {
"version": "4.3.0",
"resolved": "https://registry.npmjs.org/lodash.includes/-/lodash.includes-4.3.0.tgz",
"integrity": "sha1-YLuYqHy5I8aMoeUTJUgzFISfVT8="
"integrity": "sha512-W3Bx6mdkRTGtlJISOvVD/lbqjTlPPUDTMnlXZFnVwi9NKJ6tiAk6LVdlhZMm17VZisqhKcgzpO5Wz91PCt5b0w=="
},
"lodash.isboolean": {
"version": "3.0.3",
"resolved": "https://registry.npmjs.org/lodash.isboolean/-/lodash.isboolean-3.0.3.tgz",
"integrity": "sha1-bC4XHbKiV82WgC/UOwGyDV9YcPY="
"integrity": "sha512-Bz5mupy2SVbPHURB98VAcw+aHh4vRV5IPNhILUCsOzRmsTmSQ17jIuqopAentWoehktxGd9e/hbIXq980/1QJg=="
},
"lodash.isinteger": {
"version": "4.0.4",
"resolved": "https://registry.npmjs.org/lodash.isinteger/-/lodash.isinteger-4.0.4.tgz",
"integrity": "sha1-YZwK89A/iwTDH1iChAt3sRzWg0M="
"integrity": "sha512-DBwtEWN2caHQ9/imiNeEA5ys1JoRtRfY3d7V9wkqtbycnAmTvRRmbHKDV4a0EYc678/dia0jrte4tjYwVBaZUA=="
},
"lodash.isnumber": {
"version": "3.0.3",
"resolved": "https://registry.npmjs.org/lodash.isnumber/-/lodash.isnumber-3.0.3.tgz",
"integrity": "sha1-POdoEMWSjQM1IwGsKHMX8RwLH/w="
"integrity": "sha512-QYqzpfwO3/CWf3XP+Z+tkQsfaLL/EnUlXWVkIk5FUPc4sBdTehEqZONuyRt2P67PXAk+NXmTBcc97zw9t1FQrw=="
},
"lodash.isplainobject": {
"version": "4.0.6",
"resolved": "https://registry.npmjs.org/lodash.isplainobject/-/lodash.isplainobject-4.0.6.tgz",
"integrity": "sha1-fFJqUtibRcRcxpC4gWO+BJf1UMs="
"integrity": "sha512-oSXzaWypCMHkPC3NvBEaPHf0KsA5mvPrOPgQWDsbg8n7orZ290M0BmC/jgRZ4vcJ6DTAhjrsSYgdsW/F+MFOBA=="
},
"lodash.isstring": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/lodash.isstring/-/lodash.isstring-4.0.1.tgz",
"integrity": "sha1-1SfftUVuynzJu5XV2ur4i6VKVFE="
"integrity": "sha512-0wJxfxH1wgO3GrbuP+dTTk7op+6L41QCXbGINEmD+ny/G/eCqGzxyCsh7159S+mgDDcoarnBw6PC1PS5+wUGgw=="
},
"lodash.once": {
"version": "4.1.1",
"resolved": "https://registry.npmjs.org/lodash.once/-/lodash.once-4.1.1.tgz",
"integrity": "sha1-DdOXEhPHxW34gJd9UEyI+0cal6w="
"integrity": "sha512-Sb487aTOCr9drQVL8pIxOzVhafOjZN9UU54hiN8PU3uAiSV7lx1yYNpbNmex2PK6dSJoNTSJUUswT651yww3Mg=="
},
"lru-cache": {
"version": "6.0.0",
@@ -7292,9 +7374,9 @@
},
"dependencies": {
"semver": {
"version": "6.3.0",
"resolved": "https://registry.npmjs.org/semver/-/semver-6.3.0.tgz",
"integrity": "sha512-b39TBaTSfV6yBrapU89p5fKekE2m/NwnDocOVruQFS1/veMgdzuPcnOM34M6CwxW8jH/lxEa5rBoDeUwu5HHTw=="
"version": "6.3.1",
"resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
"integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA=="
}
}
},
@@ -7342,9 +7424,9 @@
"integrity": "sha512-wXqjST+SLt7R009ySCglWBCFpjUygmCIfD790/kVbiGmUgfYGuB14PiTd5DwVxSV4NcYHjzMkoj5LjQZwTQLEA=="
},
"minimatch": {
"version": "3.0.4",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.0.4.tgz",
"integrity": "sha512-yJHVQEhyqPLUTgt9B83PXu6W3rx4MvvHvSUvToogpwoGDOUQ+yDrR0HRot+yOCdCO7u4hX3pWft6kWBBcqh0UA==",
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
"integrity": "sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==",
"requires": {
"brace-expansion": "^1.1.7"
}
@@ -7447,9 +7529,9 @@
}
},
"nunjucks": {
"version": "3.2.3",
"resolved": "https://registry.npmjs.org/nunjucks/-/nunjucks-3.2.3.tgz",
"integrity": "sha512-psb6xjLj47+fE76JdZwskvwG4MYsQKXUtMsPh6U0YMvmyjRtKRFcxnlXGWglNybtNTNVmGdp94K62/+NjF5FDQ==",
"version": "3.2.4",
"resolved": "https://registry.npmjs.org/nunjucks/-/nunjucks-3.2.4.tgz",
"integrity": "sha512-26XRV6BhkgK0VOxfbU5cQI+ICFUtMLixv1noZn1tGU38kQH5A5nmmbk/O45xdyBhD1esk47nKrY0mvQpZIhRjQ==",
"requires": {
"a-sync-waterfall": "^1.0.0",
"asap": "^2.0.3",
@@ -7668,6 +7750,11 @@
"resolved": "https://registry.npmjs.org/qs/-/qs-6.9.7.tgz",
"integrity": "sha512-IhMFgUmuNpyRfxA90umL7ByLlgRXu6tIfKPpF5TmcfRLlLCckfP/g3IQmju6jjpu+Hh8rA+2p6A27ZSPOOHdKw=="
},
"querystringify": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/querystringify/-/querystringify-2.2.0.tgz",
"integrity": "sha512-FIqgj2EUvTa7R50u0rGsyTftzjYmv/a3hO345bZNrqabNqjtgiDMgmo4mkUjd+nzU5oF3dClKqFIPUKybUyqoQ=="
},
"range-parser": {
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/range-parser/-/range-parser-1.2.1.tgz",
@@ -10500,9 +10587,9 @@
"integrity": "sha512-XRsRjdf+j5ml+y/6GKHPZbrF/8p2Yga0JPtdqTIY2Xe5ohJPD9saDJJLPvp9+NSBprVvevdXZybnj2cv8OEd0A=="
},
"qs": {
"version": "6.5.2",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.5.2.tgz",
"integrity": "sha512-N5ZAX4/LxJmF+7wN74pUD6qAh9/wnvdQcjq9TZjevvXzSUo7bfmw91saqMjzGS2xq91/odN2dW/WOl7qQHNDGA=="
"version": "6.5.3",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.5.3.tgz",
"integrity": "sha512-qxXIEh4pCGfHICj1mAJQ2/2XVZkjCDTcEgfoSQxc/fYivUZxTkk7L3bDBJSoNrEzXI17oUO5Dp07ktqE5KzczA=="
},
"tough-cookie": {
"version": "2.5.0",
@@ -10515,6 +10602,11 @@
}
}
},
"requires-port": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/requires-port/-/requires-port-1.0.0.tgz",
"integrity": "sha512-KigOCHcocU3XODJxsu8i/j8T9tzT4adHiecwORRQ0ZZFcp7ahwXuRU1m+yuO90C5ZUyGeGfocHDI14M3L3yDAQ=="
},
"rimraf": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/rimraf/-/rimraf-3.0.2.tgz",
@@ -10552,9 +10644,12 @@
}
},
"semver": {
"version": "5.7.1",
"resolved": "https://registry.npmjs.org/semver/-/semver-5.7.1.tgz",
"integrity": "sha512-sauaDf/PZdVgrLTNYHRtpXa1iRiKcaebiKQ1BJdpQlWH2lCvexQdX55snPFyK7QzpudqbCI0qXFfOasHdyNDGQ=="
"version": "7.5.4",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.5.4.tgz",
"integrity": "sha512-1bCSESV6Pv+i21Hvpxp3Dx+pSD8lIPt8uVjRrxAUt/nbswYc+tK6Y2btiULjd4+fnq15PX+nqQDC7Oft7WkwcA==",
"requires": {
"lru-cache": "^6.0.0"
}
},
"send": {
"version": "0.17.2",
@@ -10614,10 +10709,10 @@
"resolved": "https://registry.npmjs.org/set-blocking/-/set-blocking-2.0.0.tgz",
"integrity": "sha512-KiKBS8AnWGEyLzofFfmvKwpdPzqiy16LvQfK3yv/fVH7Bj13/wl3JSR1J+rfgRE9q7xUJK4qvgS8raSOeLUehw=="
},
"set-immediate-shim": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/set-immediate-shim/-/set-immediate-shim-1.0.1.tgz",
"integrity": "sha1-SysbJ+uAip+NzEgaWOXlb1mfP2E="
"setimmediate": {
"version": "1.0.5",
"resolved": "https://registry.npmjs.org/setimmediate/-/setimmediate-1.0.5.tgz",
"integrity": "sha512-MATJdZp8sLqDl/68LfQmbP8zKPLQNV6BIZoIgrscFDQ+RsvK/BxeDQOgyxKKoh0y/8h3BqVFnCqQ/gd+reiIXA=="
},
"setprototypeof": {
"version": "1.2.0",
@@ -10688,6 +10783,11 @@
"resolved": "https://registry.npmjs.org/statuses/-/statuses-1.5.0.tgz",
"integrity": "sha1-Fhx9rBd2Wf2YEfQ3cfqZOBR4Yow="
},
"stream-transform": {
"version": "3.2.8",
"resolved": "https://registry.npmjs.org/stream-transform/-/stream-transform-3.2.8.tgz",
"integrity": "sha512-NUx0mBuI63KbNEEh9Yj0OzKB7iMOSTpkuODM2G7By+TTVihEIJ0cYp5X+pq/TdJRlsznt6CYR8HqxexyC6/bTw=="
},
"string_decoder": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz",
@@ -10758,13 +10858,14 @@
"integrity": "sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA=="
},
"tough-cookie": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/tough-cookie/-/tough-cookie-4.0.0.tgz",
"integrity": "sha512-tHdtEpQCMrc1YLrMaqXXcj6AxhYi/xgit6mZu1+EDWUn+qhUf8wMQoFIy9NXuq23zAwtcB0t/MjACGR18pcRbg==",
"version": "4.1.3",
"resolved": "https://registry.npmjs.org/tough-cookie/-/tough-cookie-4.1.3.tgz",
"integrity": "sha512-aX/y5pVRkfRnfmuX+OdbSdXvPe6ieKX/G2s7e98f4poJHnqH3281gDPm/metm6E/WRamfx7WC4HUqkWHfQHprw==",
"requires": {
"psl": "^1.1.33",
"punycode": "^2.1.1",
"universalify": "^0.1.2"
"universalify": "^0.2.0",
"url-parse": "^1.5.3"
},
"dependencies": {
"punycode": {
@@ -10825,9 +10926,9 @@
"integrity": "sha512-Tfay0l6gJMP5rkil8CzGbLthukn+9BN/VXWcABVFPjOoelJ+koW8BuPZYk+h/L+lEeIp1fSzVRiWRPIjKVjPdg=="
},
"universalify": {
"version": "0.1.2",
"resolved": "https://registry.npmjs.org/universalify/-/universalify-0.1.2.tgz",
"integrity": "sha512-rBJeI5CXAlmy1pV+617WB9J63U6XcazHHF2f2dbJix4XzpUF0RS3Zbj0FGIOCAva5P/d/GBOYaACQ1w+0azUkg=="
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/universalify/-/universalify-0.2.0.tgz",
"integrity": "sha512-CJ1QgKmNg3CwvAv/kOFmtnEN05f0D/cn9QntgNOQlQF9dgvVTHj3t+8JPdjqawCHk7V/KA+fbUqzZ9XWhcqPUg=="
},
"unpipe": {
"version": "1.0.0",
@@ -10849,6 +10950,15 @@
}
}
},
"url-parse": {
"version": "1.5.10",
"resolved": "https://registry.npmjs.org/url-parse/-/url-parse-1.5.10.tgz",
"integrity": "sha512-WypcfiRhfeUP9vvF0j6rw0J3hrWrw6iZv3+22h6iRMJ/8z1Tj6XfLP4DsUix5MhMPnXpiHDoKyoZ/bdCkwBCiQ==",
"requires": {
"querystringify": "^2.1.1",
"requires-port": "^1.0.0"
}
},
"util-deprecate": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz",
@@ -10932,9 +11042,9 @@
}
},
"word-wrap": {
"version": "1.2.3",
"resolved": "https://registry.npmjs.org/word-wrap/-/word-wrap-1.2.3.tgz",
"integrity": "sha512-Hz/mrNwitNRh/HUAtM/VT/5VH+ygD6DV7mYKZAtHOrbs8U7lvPS6xf7EJKMF0uW1KJCl0H701g3ZGus+muE5vQ=="
"version": "1.2.5",
"resolved": "https://registry.npmjs.org/word-wrap/-/word-wrap-1.2.5.tgz",
"integrity": "sha512-BN22B5eaMMI9UMtjrGd5g5eCYPpCPDUy0FJXbYsaT5zYxjFOckS53SQDE3pWkVoWpHXVb3BrYcEN4Twa55B5cA=="
},
"wrappy": {
"version": "1.0.2",

View File

@@ -11,11 +11,11 @@
"license": "UNLICENSED",
"private": true,
"config": {
"db_schema": "^0.3.11",
"api": "^0.3.0"
"db_schema": "^0.4.2",
"api": "^0.4.0"
},
"engines": {
"node": ">=14.0.0"
"node": ">=18.0.0"
},
"os": [
"linux"
@@ -23,11 +23,12 @@
"dependencies": {
"body-parser": "gitlab:aaltronav/contrib/expressjs/body-parser",
"cookie-parser": "^1.4.5",
"csv": "^6.3.3",
"debug": "^4.3.4",
"express": "^4.17.1",
"express-jwt": "^6.0.0",
"express-jwt": "^8.4.1",
"json2csv": "^5.0.6",
"jsonwebtoken": "^8.5.1",
"jsonwebtoken": "^9.0.2",
"leaflet-headless": "git+https://git@gitlab.com/aaltronav/contrib/leaflet-headless.git#devel",
"marked": "^4.0.12",
"netmask": "^2.0.2",

View File

@@ -1,71 +0,0 @@
const { pool } = require('../lib/db/connection');

var client;
const channels = {};

async function notify (data) {
  if (data.channel in channels) {
    data._received = new Date();
    try {
      const json = JSON.parse(data.payload);
      data.payload = json;
    } catch {
      // Ignore the error
    }
    for (const listener of channels[data.channel]) {
      await listener(JSON.parse(JSON.stringify(data)));
    }
  }
}

function reconnect () {
  console.log("Reconnecting");
  // No need to provide parameters, channels should already be populated.
  listen();
}

async function listen (addChannels, callback) {
  if (!client) {
    try {
      client = await pool.connect();
    } catch (err) {
      console.error("Error connecting to DB", err);
      console.log("Will try again in 15 seconds");
      setImmediate(() => client = null);
      setTimeout(() => {
        listen(addChannels, callback);
      }, 15000);
      return;
    }
    client.on('notification', notify);
    console.log("Websocket client connected", Object.keys(channels));
    client.on('error', (err) => console.error("Events client error: ", err));
    client.on('end', () => {
      console.warn("Websocket events client disconnected. Will attempt to reconnect in five seconds");
      setImmediate(() => client = null);
      setTimeout(reconnect, 5000);
    });
  }
  if (addChannels) {
    if (!Array.isArray(addChannels)) {
      addChannels = [addChannels];
    }
    for (const channel of addChannels) {
      if (!(channel in channels)) {
        await client.query("LISTEN " + channel);
        channels[channel] = [];
        console.log("Listening to ", channel);
      }
      channels[channel].push(callback);
    }
  }
}

module.exports = {
  listen
};

View File

@@ -1,6 +1,6 @@
const ws = require('ws');
const URL = require('url');
const db = require('./db');
const { listen } = require('../lib/db/notify');
const channels = require('../lib/db/channels');
function start (server, pingInterval=30000) {
@@ -22,7 +22,7 @@ function start (server, pingInterval=30000) {
}
});
db.listen(channels, (data) => {
listen(channels, (data) => {
wsServer.clients.forEach( (socket) => {
socket.send(JSON.stringify(data));
})
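
The new ../lib/db/notify module does not appear in this diff. As a rough, hypothetical sketch (not the project's actual code) of what a pg-listen based implementation of listen(channels, callback) could look like, assuming DATABASE_URL holds the connection string:

  // Hypothetical sketch only; the real lib/db/notify is not in this diff.
  const createSubscriber = require('pg-listen'); // may need .default under CommonJS

  const subscriber = createSubscriber({ connectionString: process.env.DATABASE_URL });
  const channels = new Set();
  const connected = subscriber.connect(); // pg-listen handles reconnection itself

  async function listen (addChannels, callback) {
    await connected;
    if (!Array.isArray(addChannels)) {
      addChannels = [addChannels];
    }
    for (const channel of addChannels) {
      // pg-listen parses JSON payloads before invoking the handler.
      subscriber.notifications.on(channel, callback);
      if (!channels.has(channel)) {
        channels.add(channel);
        await subscriber.listenTo(channel); // issues LISTEN <channel>
      }
    }
  }

  subscriber.events.on('error', (err) => console.error('Notifier error:', err));

  module.exports = { listen };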

View File

@@ -16,7 +16,12 @@ OUTPATH="$OUTDIR/$OUTNAME"
# 30000/UDP: Navigation system headers
# Not all inputs will be present in all systems.
#
EXPR="udp and (port 4461 or port 4462 or port 30000)"
# NOTE: $INS_HOST must be defined and point to the
# navigation server. We match on the source host rather
# than a port for this data because only the first
# fragment of a fragmented UDP datagram carries the UDP
# header, so a port filter misses the remaining fragments.
#
EXPR="udp and (port 4461 or port 4462 or src host $INS_HOST)"
if [[ ! -d "$OUTDIR" ]]; then
mkdir "$OUTDIR"

sbin/rewrite-captures.sh (new executable file, +42 lines)
View File

@@ -0,0 +1,42 @@
#!/bin/bash
#
# Rewrite packet captures in order to be able to replay them.
#
# SINET:  Rewrite all packets with this source IP address
# SETHER: Rewrite all packets with this source MAC address
#
# DINET:  Rewrite all packets with this destination IP address
# DETHER: Rewrite all packets with this destination MAC address
#
# The resulting files have the original name with "-rewritten.pcap"
# appended as a suffix. Those packets may then be replayed from a
# different computer or virtual container, for instance with:
#
#   sudo bittwist -i 1 -v -m10 capture-rewritten.pcap
#
# Where -i n is the interface name (use bittwist -d to list available
# interfaces), -v is the verbose flag and -m10 replays at 10× speed.
#
SINET=${SINET:-$(ip -o -4 addr |grep -v " lo " |head -n 1 |sed -r 's/^.*inet\s([0-9.]+).*$/\1/')}
SETHER=${SETHER:-$(ip -o link |grep -v " lo" |head -n 1 |sed -r 's/^.*ether\s([0-9a-fA-F:]+).*$/\1/')}
DINET=${DINET:-$(ip -o -4 addr |grep -v " lo " |head -n 1 |sed -r 's/^.*inet\s([0-9.]+).*$/\1/')}
DETHER=${DETHER:-$(ip -o link |grep -v " lo" |head -n 1 |sed -r 's/^.*ether\s([0-9a-fA-F:]+).*$/\1/')}
for f in "$@"; do
  OUTFNAME="$f-rewritten.pcap"
  echo "$f -> $OUTFNAME"
  if [[ -n "$SINET" && -n "$SETHER" ]]; then
    tcprewrite -S 0.0.0.0/0:"$SINET" --enet-smac="$SETHER" \
               -D 0.0.0.0/0:"$DINET" --enet-dmac="$DETHER" \
               --infile "$f" \
               --outfile "$OUTFNAME"
  else
    tcprewrite -D 0.0.0.0/0:"$DINET" --enet-dmac="$DETHER" \
               --infile "$f" \
               --outfile "$OUTFNAME"
  fi
done
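
A hypothetical invocation (illustrative values), rewriting a set of captures so they can be replayed towards 172.16.0.20:

  DINET=172.16.0.20 DETHER=52:54:00:12:34:56 sbin/rewrite-captures.sh var/captures/*.pcap

Each input file f produces f-rewritten.pcap alongside it; SINET and SETHER default to the first non-loopback address of the local machine.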