Compare commits

..

113 Commits

Author SHA1 Message Date
D. Berge
fd41d2a6fa Launch database housekeeping tasks from runner 2022-05-01 20:10:27 +02:00
D. Berge
39690c991b Update database templates.
* Add index on public.real_time_inputs.meta->>'tstamp'
* Add public.geometry_from_tstamp()
* Add augment_event_data()
2022-05-01 19:47:16 +02:00
D. Berge
09ead4878f Add database upgrade file 17 2022-05-01 19:46:04 +02:00
D. Berge
588d210f24 Fix reporting for “gun pressures” QC test.
Fixes #205.
2022-04-30 17:37:38 +02:00
D. Berge
28be86e7ff Graphs view: delay “no sequences” message until loaded.
Related to #196.
2022-04-30 16:14:32 +02:00
D. Berge
1eac97cbd0 Change “No fire” QC definition 2022-04-30 16:13:12 +02:00
D. Berge
e3a3bdb153 Clean up whitespace.
Commands used:

find . -type f -name '*.js'| while read FILE; do if echo $FILE |grep -qv node_modules; then sed -ri 's/^\s+$//' "$FILE"; fi; done
find . -type f -name '*.vue'| while read FILE; do if echo $FILE |grep -qv node_modules; then sed -ri 's/^\s+$//' "$FILE"; fi; done
find . -type f -name '*.py'| while read FILE; do if echo $FILE |grep -qv node_modules; then sed -ri 's/^\s+$//' "$FILE"; fi; done
2022-04-29 14:48:21 +02:00
D. Berge
0e534b583c Do not assume that lines have remarks.
Fixes #202.
2022-04-29 14:32:46 +02:00
D. Berge
51480e52ef Recognise "dark", "light" label view attributes.
In a label definition (in etc/surveys/*.yaml) we can now have
"dark" or "light" attributes under "view" to force the label
text to always use either the dark or light theme. This is
useful when a label's colour causes a bad contrast in either
theme.

Example:

  labels:
      Daily:
          view:
              colour: "#EFEBE9"
              description: "Of interest in the daily report"
              light: true # Text always displayed in a dark colour
          model:
              user: true
              multiple: true
2022-04-29 12:18:09 +02:00
D. Berge
187807cfb1 Enable Save button as soon as the remarks are changed.
Closes #199.
2022-04-27 19:45:26 +02:00
D. Berge
d386b97e42 Database upgrade 16: fix event edits.
Fixes #198.
2022-04-27 17:41:53 +02:00
D. Berge
da578d2e50 Fix project_summary view returning unwanted rows.
Fixes #197.
2022-04-27 10:49:46 +02:00
D. Berge
7cf89d48dd Fix whitespace 2022-04-26 17:41:48 +02:00
D. Berge
c0ec8298fa Don't try to show QC graphs on a new project.
If there are no sequences, just show a message to that effect.

Fixes #196.
2022-04-26 17:39:59 +02:00
D. Berge
68322ef562 Fix misleading comment.
Use an EPSG code that is actually in the work area of the Dougal boats.
2022-04-26 17:36:48 +02:00
D. Berge
888228c9a2 Do not crash if a project doesn't have QCs defined.
Fixes #195.
2022-04-26 14:50:34 +02:00
D. Berge
74d6f0b9a0 Accept mime query parameter 2022-04-16 17:18:04 +02:00
D. Berge
cf475ce2df Adapt middleware to new database schema.
As introduced by commit 0c6567d8f8.
2022-04-16 17:18:04 +02:00
D. Berge
26033b2a37 Fix syntax error.
Introduced by commit ead938b40f.
2022-04-13 09:04:52 +02:00
D. Berge
fafd4928d9 Fix Marked call (adapt to new Marked version) 2022-04-13 08:18:21 +02:00
D. Berge
ec38fdb290 Pin package sass version to avoid annoying warning 2022-03-18 20:07:50 +01:00
D. Berge
086172c5e7 Upgrade dependencies.
This is a conservative upgrade.

The upgraded version of leaflet-arrowheads uses optional chaining which
seems to cause webpack to choke, so it was added to "transpileDependencies" in
vue.config.js.

Closes #189.
2022-03-18 16:29:50 +01:00
D. Berge
3db453a271 Add keys to v-for loops 2022-03-18 16:15:06 +01:00
D. Berge
a5db9c984b Show sequence comments in log page 2022-03-18 15:05:08 +01:00
D. Berge
ead938b40f Inhibit exports.
They don't seem to be used, and for backups it's better to
just back up the whole database instead, which is being done
remotely.
2022-03-18 13:32:43 +01:00
D. Berge
634a7be3f1 Merge branch '184-refactor-qcs' into devel 2022-03-17 20:12:15 +01:00
D. Berge
913606e7f1 Allow forcing QCs.
QCs may be re-run for specific sequences or for a whole
project by defining an environment variable, as follows:

For an entire project:

* DOUGAL_FORCE_QC="project-id"

For specific sequences:

* DOUGAL_FORCE_QC="project-id sequence1 sequence2 … sequenceN"
2022-03-17 20:10:26 +01:00
D. Berge
49b7747ded Remove *all* QC events when saving sequence results.
When saving shot-by-shot results for a sequence,
*all* existing QC events for that sequence will be
removed first.

We do this because otherwise we may end up with QC
data for shots that no longer exist. Also, in the
case that we have QCed based on raw data, QC results
for shots which are not in the final data would stay
around even though those shots are no longer valid.
2022-03-17 20:07:11 +01:00
D. Berge
1fd265cc74 Update dependencies 2022-03-17 20:05:07 +01:00
D. Berge
13389706a9 Merge branch '184-refactor-qcs' into devel 2022-03-17 18:43:38 +01:00
D. Berge
818cd8b070 Add pg-cursor dependency, needed by QCs 2022-03-17 18:43:12 +01:00
D. Berge
a3d3c7aea7 Merge branch '184-refactor-qcs' into devel 2022-03-17 18:37:14 +01:00
D. Berge
a592ab5f6c Use digests rather than timestamps for QC execution.
Using timestamps does not work as we might be
importing files with timestamps older than the
last QC run. Those would not be detected by a
timestamp-based method but would be by this
digest-based approach.

There is a project-wide digest and per sequence
digests. The former takes the path and hashes of
all files known to Dougal for this project (the
`files` table), concatenantes them and computes
the MD5 checksum. Sequence digests do the same
but only including the files related to that
sequence.
2022-03-17 18:32:09 +01:00
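A minimal Python sketch of the digest scheme described above (the real implementation lives in the Node code; row ordering and string encoding are assumptions):

```python
from hashlib import md5

def project_digest(files):
    """Digest over (path, hash) rows from the `files` table:
    concatenate paths and hashes, then take the MD5 checksum."""
    h = md5()
    for path, file_hash in files:
        h.update(path.encode())
        h.update(file_hash.encode())
    return h.hexdigest()

def sequence_digest(rows, sequence):
    """Same computation, restricted to the files related to one
    sequence; assumes each row carries its sequence number."""
    return project_digest(
        (path, file_hash)
        for path, file_hash, seq in rows
        if seq == sequence
    )
```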
D. Berge
9b571ce34d Merge branch '138-keep-edit-history-of-event-log-entries' into devel 2022-03-16 21:31:38 +01:00
D. Berge
aa2b158088 Remove spurious actions from DB template 2022-03-16 21:30:32 +01:00
D. Berge
0d1f2b207c Apply changes from 38e4e705a4 to DB schema template 2022-03-16 21:29:53 +01:00
D. Berge
38e4e705a4 Modify database upgrade file 12.
Two functions that depended on the `events` view were
changed to work with `event_log` instead.
2022-03-16 21:08:42 +01:00
D. Berge
82d7036860 Merge branch '138-keep-edit-history-of-event-log-entries' into 'devel'
Resolve "Keep edit history of event log entries"

Closes #78, #101, #138, #141, #170, #172, and #181

See merge request wgp/dougal/software!20
2022-03-15 13:25:43 +00:00
D. Berge
0727e7db69 Update database templates to schema v0.3.1 2022-03-15 14:17:28 +01:00
D. Berge
2484b1c473 Merge branch '188-adapt-qc-results-view-to-new-api-endpoints' into 138-keep-edit-history-of-event-log-entries 2022-03-09 21:37:27 +01:00
D. Berge
750beb5c02 Add explicit indication of all tests passed 2022-03-09 21:36:49 +01:00
D. Berge
cd2e7bbd0f Merge branch '184-refactor-qcs' into 138-keep-edit-history-of-event-log-entries 2022-03-09 21:26:40 +01:00
D. Berge
21d5383882 Update QC check definitions 2022-03-09 21:25:47 +01:00
D. Berge
2ec484da41 Fix detection of sequence modification time 2022-03-09 21:25:04 +01:00
D. Berge
648ce9970f Interpolate timestamps for non-existing shotpoints 2022-03-09 21:22:33 +01:00
D. Berge
fd278a5ee6 Add database function: tstamp_interpolate 2022-03-09 21:21:48 +01:00
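The interpolation is linear between the nearest existing shots on either side, as the SQL in upgrade file 11 (shown towards the end of this diff) makes clear. A Python restatement for reference, not the shipped implementation:

```python
def tstamp_interpolate(shots, p):
    """shots: iterable of (point, tstamp) pairs for one sequence;
    p: the missing shotpoint number.

    Returns an interpolated timestamp, or None when p has no existing
    shot on both sides (mirroring the SQL function returning NULL).
    """
    before = [(pt, ts) for pt, ts in shots if pt < p]
    after = [(pt, ts) for pt, ts in shots if pt > p]
    if not before or not after:
        return None
    pt0, ts0 = max(before)  # closest existing shot below p
    pt1, ts1 = min(after)   # closest existing shot above p
    return ts0 + (ts1 - ts0) / abs(pt1 - pt0) * abs(p - pt0)
```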
D. Berge
4f5cce33fc Add comments to database functions 2022-03-09 21:21:01 +01:00
D. Berge
53bb75a2c1 Add new database upgrade file 11.
Some of the things in new upgrade file 12 depend
on the functions defined here.
2022-03-09 19:07:58 +01:00
D. Berge
45595bd64f Rename database upgrades 11‒13 → 12‒14 2022-03-09 19:07:58 +01:00
D. Berge
af4d141c6a Merge branch '184-refactor-qcs' into '138-keep-edit-history-of-event-log-entries'
Resolve "Refactor QCs"

See merge request wgp/dougal/software!22
2022-03-09 17:46:20 +00:00
D. Berge
bef2be10d2 Merge branch '188-adapt-qc-results-view-to-new-api-endpoints' into '184-refactor-qcs'
Resolve "Adapt QC results view to new API endpoints"

See merge request wgp/dougal/software!24
2022-03-09 16:56:35 +00:00
D. Berge
803a08a736 Merge branch '187-create-qc-results-api-endpoints' into '184-refactor-qcs'
Resolve "Create QC results API endpoints"

See merge request wgp/dougal/software!23
2022-03-09 16:55:57 +00:00
D. Berge
c86cbdc493 Refactor QC view to use new API endpoint.
This provides essentially the same user experience as the old
endpoint, with one exception as of this commit:

* The user is not able to “accept” or “unaccept” QC events.
2022-03-09 17:50:55 +01:00
D. Berge
186615d988 Add comments for ease of browsing 2022-03-09 17:43:51 +01:00
D. Berge
666f91de18 Add QC results API endpoint 2022-03-09 17:43:10 +01:00
D. Berge
c8ce786e39 Add API middleware for returning QC results 2022-03-09 17:41:27 +01:00
D. Berge
73cb26551b Add library functions for getting QC results from DB.
We return the QC definitions tree structure, augmented with
a `sequences` attribute which contains `raw_lines` tuples
which are in turn augmented with a `shots` attribute
containing `event_log` tuples. The whole structure looks
something like:

qc_test:
  qc_test:
    sequences:
      - sequence0:
          shots: [sp0, sp1, …]
      - sequence1:
          shots: [sp0, sp1, …]
  qc_test:
    sequences:
      - sequence0:
          shots: [sp0, sp1, …]
  …
2022-03-09 17:35:12 +01:00
D. Berge
d90acb1aeb Add utility to convert QC definitions tree into a flat list 2022-03-09 17:32:23 +01:00
D. Berge
14a2f57c8d Refactor QC execution and results saving.
The results are now saved as follows:

For shot QCs, failing tests result in an event being created in
the event_log table. The text of the event is the QC result message,
while the labels are as set in the QC definition. It is conventionally
expected that these include a `QC` label. The event `meta` contains a
`qc_id` attribute with the ID of the failing QC.

For sequences, failing tests result in a `meta` entry under `qc`, with
the QC ID as the key and the result message as the value.

Finally, the project's `info` table still has a `qc` key, but unlike
with the old code, which stored all the QC results in a huge object
under this key, now only the timestamp of the last time a QC was run on
this project is stored, as `{ "updatedOn": timestamp }`.

The QCs are launched by calling the main() function in /lib/qc/index.js.
This function will first check the timestamp of the files imported into
the project and only run QCs if any of the file timestamps are later
than `info.qc.updatedOn`. Likewise, for each sequence, the timestamps of
the files making up that sequence are checked against
`info.qc.updatedOn`, and only those which are newer are actually
processed. This cuts down the running time very considerably.

The logic now is much easier on memory too, as it doesn't load the
whole project at once into memory. Instead, shotpoint QCs are processed
first, and for this a cursor is used, fetching one shotpoint at a
time. Then the sequence QCs are run, also one sequence at a time
(fetched via an individual query touching the `sequences_summary` view,
rather than via a cursor; we reuse some of the lib/db functions here),
for each sequence all its shotpoints and a list of missing shots are
also fetched (via lib/db function reuse) and passed to the QC functions
as predefined variables.

The logic of the QC functions is also changed. Now they can return:

* If a QC passes, the function MUST return boolean `true`.

* If a QC fails, the function MAY return a string describing the nature
  of the failure, or in the case of an `iterate: sequence` type test,
  it may return an object with these attributes:

  - `remarks`: a string describing the nature of the failure;
  - `labels`: a set of labels to associate with this failure;
  - `shots`: an object in which each attribute denotes a shotpoint number
    and whose value is either a string or an object with `remarks`
    (string) and `labels` (array of strings) attributes. This allows us
    to add detail about exactly which shotpoints contribute to a
    sequence-wide test failure (this may not be applicable to every
    sequence-wide QC), and it is also a handy way to detect and insert
    events for missing shots.

* For QCs which may give false positives, such as missing gun data, a
  new QC definition attribute is introduced: if `ignoreAllFailed` is
  boolean `true` and all shots fail the test for a sequence, or all
  sequences fail the test for a prospect, the results of the QC will be
  ignored, as if the test had passed. This is mostly to deal with gun or
  any other data that may be temporarily missing.
2022-03-07 21:41:10 +01:00
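The return contract for QC functions can be summarised in code. The real tests are JavaScript; this is a hedged Python sketch with hypothetical test bodies:

```python
def qc_shot_passes(shot):
    # A passing QC MUST return boolean True.
    return True

def qc_shot_fails(shot):
    # A failing shot QC MAY return a string describing the failure.
    return "gun pressure out of tolerance"

def qc_sequence_fails(sequence, shots, missing_shots):
    # An `iterate: sequence` test may instead return an object with
    # remarks, labels and per-shotpoint detail, including entries for
    # missing shots.
    return {
        "remarks": "missing shots detected",
        "labels": ["QC"],
        "shots": {sp: "missing shot" for sp in missing_shots},
    }
```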
D. Berge
67f8b9c6dd Bypass permissions check on info.put() if role is null.
The comparison is strict non-equality so a null role cannot
be forced via the API.

The need for this is so that we can reuse this function to
save QC results, which is something that does not take
place over the API.
2022-03-07 21:20:21 +01:00
D. Berge
d3336c6cf7 Add fetchRow DB function.
Helper function to fetch a row at a time using a cursor.
2022-03-07 21:16:43 +01:00
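fetchRow itself is Node code built on the pg-cursor dependency added above; the equivalent row-at-a-time pattern on the Python side of the repository would use a psycopg2 server-side (named) cursor, roughly as follows (a sketch only; the names are not from the repository):

```python
import psycopg2

def fetch_rows(conn, qry, params=()):
    """Yield rows one at a time via a server-side cursor so the full
    result set never has to sit in memory at once."""
    with conn.cursor(name="fetch_rows") as cur:  # named => server-side
        cur.itersize = 1  # fetch one row per round trip
        cur.execute(qry, params)
        for row in cur:
            yield row
```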
D. Berge
17bb88faf4 Cope with P1/11s with no S records 2022-03-07 21:08:22 +01:00
D. Berge
a52c7e91f5 Document in runner.sh how to run ASAQC in test mode 2022-03-07 21:07:20 +01:00
D. Berge
8debe60d5c Cope with undefined labels 2022-03-02 19:39:29 +01:00
D. Berge
ee9a33513a Update database README 2022-02-28 21:27:20 +01:00
D. Berge
723c9cc166 Make it possible to repeatedly apply DB upgrade 11.
Even though this makes PostgreSQL 14 a hard dependency.
2022-02-28 21:26:19 +01:00
D. Berge
cb952d37f7 Fix: do not require file that no longer exists 2022-02-28 21:25:00 +01:00
D. Berge
d5fc04795d Make rows dense.
This should probably be turned into an option controlled by the
user.
2022-02-27 19:59:06 +01:00
D. Berge
4e0737335f Add row context menu.
It replaces the `Actions` column in the old table and provides
more actions.

The user can now edit not just the comments and labels but also
the timestamp / shotpoint as requested in #78 (closes #78).

Because events are grouped by timestamp / shotpoint (each row
represents a unique timestamp or shotpoint), the behaviour is
slightly different depending on whether the user clicks on a
row containing a single (editable) event, or on one of multiple
editable events in the same row. Also, rows containing only
read-only events are recognised and no editing actions are
provided for those.
2022-02-27 19:59:06 +01:00
D. Berge
d47c8a9e10 Add (disabled) active row highlighter.
It implements the same functionality as in other tabs
such as sequences, lines, etc., but it is disabled here
because in my opinion it doesn't look too nice.

It will probably be a matter of enabling it at some point
and asking for feedback on user preference.
2022-02-27 19:56:21 +01:00
D. Berge
7ea0105d9f Add popularLabels computed property.
Returns a list of labels used in the current view,
in order of popularity (most used first).

NOTE: this property is not actually used. It's
technically dead code.
2022-02-27 19:56:21 +01:00
D. Berge
8f4bda011b Add dialogue to edit event labels.
This assumes that adding or removing labels is a relatively
common action to perform on an event and provides a quicker
and simpler mechanism than bringing up the full event
dialogue.

This is meant to be invoked from a context menu action or
similar.
2022-02-27 19:56:21 +01:00
D. Berge
48505dbaeb View event history.
When an event has been modified, this control opens a dialogue
where the previous version of the event may be reviewed and if
necessary restored.

Technically, this was the crux of, and closes, #138.
2022-02-27 19:56:21 +01:00
D. Berge
278c46f975 Adapt events view to new schema 2022-02-27 19:56:21 +01:00
D. Berge
180343754a Remove old event edit dialogue 2022-02-27 19:56:21 +01:00
D. Berge
9aa9ce979b Replace event edit dialogue.
The old <dougal-event-edit-dialog/> gets replaced by
<dougal-event-edit/> which handles the new events schema.
2022-02-27 19:56:21 +01:00
D. Berge
1e5be9c655 Add new event edit dialogue.
Replaces <dougal-event-edit-dialog/>.
2022-02-27 19:56:21 +01:00
D. Berge
0be5dba2b9 Return also labels from <dougal-context-menu/>.
Keeping in mind that the input model is a tree and labels
may be at any level in the tree, not just in the leaves.
2022-02-27 19:56:21 +01:00
D. Berge
0c91e40817 Fix <dougal-context-menu/> default prop value 2022-02-27 19:56:21 +01:00
D. Berge
c1440c7ac8 Simplify <dougal-context-menu/> model 2022-02-27 19:56:21 +01:00
D. Berge
606f18c016 Add Vuex position and timestamp getters for real-time event 2022-02-27 19:56:21 +01:00
D. Berge
febf109cce Update API description 2022-02-27 19:56:21 +01:00
D. Berge
9b700ffb46 Update required database schema 2022-02-27 19:56:21 +01:00
D. Berge
9aca927e49 Update version checking mechanism.
Checks both database schema and API versions.
2022-02-27 19:56:21 +01:00
D. Berge
adaa1a6b8a Add version number to API 2022-02-27 19:56:21 +01:00
D. Berge
8790a797d9 Allow restricting by timestamp or position.
Closes #181.
2022-02-27 19:56:21 +01:00
D. Berge
d7d75f34cd Remove event caching.
That was a horrible kludge and should not be necessary with the
new schema, which is simpler and much faster.
2022-02-27 19:56:21 +01:00
D. Berge
950582a5c6 Refactor event middleware and db code to use new tables 2022-02-27 19:56:21 +01:00
D. Berge
d0da1b005b Add replaceMarkers utility function 2022-02-27 19:56:21 +01:00
D. Berge
1e2c816ef3 Add database upgrade file 13.
Drops the old event tables.

NOTE: consider not applying this patch until confident that
the migration has proceeded smoothly. Dougal can operate just
fine without it.
2022-02-27 19:56:21 +01:00
D. Berge
54b457b4ea Add database upgrade file 12.
Migrates data from old event tables to new.
2022-02-27 19:56:21 +01:00
D. Berge
4d2efd1e04 Move sequence events middleware to a different path.
This is to make room for a new endpoint to retrieve
data for individual events.
2022-02-27 19:56:21 +01:00
D. Berge
920ea83ece Add API endpoint to retrieve a single shotpoint.
This will be used by the new event dialogue in the
frontend to get shotpoint information when creating
or editing events.
2022-02-27 19:56:21 +01:00
D. Berge
d33fe4e936 Add database utilities file.
Intended to contain reusable functions.
2022-02-27 19:56:21 +01:00
D. Berge
c347b873c5 Update database README.
Add information on restoring from backup and troubleshooting
details when migrating PostgreSQL versions.
2022-02-27 19:56:21 +01:00
D. Berge
0c6567d8f8 Add database upgrade file 11 2022-02-27 19:56:12 +01:00
D. Berge
195741a768 Merge branch '173-do-not-use-inodes-as-part-of-a-file-s-fingerprint' into 'devel'
Resolve "Do not use inodes as part of a file's fingerprint"

Closes #173

See merge request wgp/dougal/software!19
2022-02-07 16:08:04 +00:00
D. Berge
0ca44c3861 Add database upgrade file 10.
NOTE: this is the first time we modify the actual data
in the database, as opposed to adding to the schema.
2022-02-07 17:05:19 +01:00
D. Berge
53ed096e1b Modify file hashing function.
We remove the inode from the hash as it is unstable when the
files are on an SMB filesystem, and replace it with an MD5
of the absolute file path.
2022-02-07 17:03:10 +01:00
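The resulting function (see the datastore.py hunk further down in this diff) amounts to:

```python
import os
from hashlib import md5

def file_hash(file):
    """Identify a file by size, mtime and ctime plus the first 16 hex
    digits of the MD5 of its path, instead of the unstable inode."""
    name_digest = md5(file.encode()).hexdigest()[:16]
    st = os.stat(file)
    return ":".join(
        str(v) for v in [st.st_size, st.st_mtime, st.st_ctime, name_digest]
    )
```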
D. Berge
75f91a9553 Increment schema wanted version 2022-02-07 17:02:59 +01:00
D. Berge
40b07c9169 Merge branch '175-add-database-versioning-and-migration-mechanism' into 'devel'
Resolve "Add database versioning and migration mechanism"

Closes #175

See merge request wgp/dougal/software!18
2022-02-07 14:43:50 +00:00
D. Berge
36e7b1fe21 Add database upgrade file 09 2022-02-06 23:26:57 +01:00
D. Berge
e7fa74326d Add README to database upgrades directory 2022-02-06 23:24:24 +01:00
D. Berge
83be83e4bd Check database schema compatibility.
The server will not start unless it satisfies itself that we're
running against a compatible database schema.
2022-02-06 22:52:45 +01:00
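A hedged sketch of such a start-up check, in Python for brevity (the server is Node, and the package.json key layout used here is an assumption); the `info` query matches the one given in the upgrades README below:

```python
import json
import psycopg2

def assert_schema_compatible(conn, package_json="package.json"):
    """Abort start-up unless the database reports the schema version
    the code expects. A real check may accept compatible ranges
    rather than requiring strict equality."""
    with open(package_json) as fd:
        wanted = json.load(fd)["dougal"]["db_schema"]  # hypothetical key
    with conn.cursor() as cur:
        cur.execute(
            "SELECT value->>'db_schema' FROM public.info WHERE key = 'version';"
        )
        row = cur.fetchone()
    actual = row[0] if row else None
    if actual != wanted:
        raise SystemExit(
            f"Incompatible database schema: wanted {wanted}, got {actual}"
        )
```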
D. Berge
81ce6346b9 Add database schema information to package.json.
Used to determine if the actual schema on the database
is compatible with the version of the server we're
attempting to run.
2022-02-06 22:51:25 +01:00
D. Berge
923ff1acea Add more details to package.json 2022-02-06 22:50:44 +01:00
D. Berge
8ec479805a Add version reporting library.
This reports the current server version, from Git by
default.

Also, and of more interest, it reports whether the
current database schema is compatible with the
server code.
2022-02-06 22:48:20 +01:00
D. Berge
f10103d396 Enforce info key access restrictions on the API.
Obviously, those keys can be edited freely at the database
level. This is intended.
2022-02-06 22:40:53 +01:00
D. Berge
774bde7c00 Reserve certain keys on info tables 2022-02-06 22:39:11 +01:00
D. Berge
b4569c14df Update database README.
Document how to create a Dougal database from scratch
and how to update PostgreSQL.
2022-02-06 22:28:21 +01:00
D. Berge
54eea62e4a Fix require path 2022-02-06 14:24:25 +01:00
D. Berge
69c4f2dd9e Merge branch '161-transfer-files-to-asaqc' into 'devel'
Resolve "Transfer files to ASAQC"

Closes #161

See merge request wgp/dougal/software!16
2021-10-09 09:23:54 +00:00
D. Berge
ff4913c0a5 Instrument getLineName to monitor probable cause of #165 2021-10-06 02:12:05 +02:00
163 changed files with 21002 additions and 13541 deletions


@@ -10,7 +10,7 @@
# be known to the database.
# * PROJECT_NAME is a more descriptive name for human consumption.
# * EPSG_CODE is the EPSG code identifying the CRS for the grid data in the
# navigation files, e.g., 32031.
# navigation files, e.g., 23031.
#
# In addition to this, certain other parameters may be controlled via
# environment variables:


@@ -4,6 +4,7 @@ import psycopg2
import configuration
import preplots
import p111
from hashlib import md5 # Because it's good enough
"""
Interface to the PostgreSQL database.
@@ -11,13 +12,16 @@ Interface to the PostgreSQL database.
def file_hash(file):
"""
Calculate a file hash based on its size, inode, modification and creation times.
Calculate a file hash based on its name, size, modification and creation times.
The hash is used to uniquely identify files in the database and detect if they
have changed.
"""
h = md5()
h.update(file.encode())
name_digest = h.hexdigest()[:16]
st = os.stat(file)
return ":".join([str(v) for v in [st.st_size, st.st_mtime, st.st_ctime, st.st_ino]])
return ":".join([str(v) for v in [st.st_size, st.st_mtime, st.st_ctime, name_digest]])
class Datastore:
"""
@@ -390,9 +394,9 @@ class Datastore:
with self.conn.cursor() as cursor:
cursor.execute("BEGIN;")
hash = self.add_file(filepath, cursor)
if not records or len(records) == 0:
print("File has no records (or none have been detected)")
# We add the file to the database anyway to signal that we have
@@ -412,13 +416,13 @@ class Datastore:
"""
cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"], ntbp, incr, json.dumps(fileinfo["meta"])))
qry = """
UPDATE raw_lines
SET meta = meta || %s
WHERE sequence = %s;
"""
cursor.execute(qry, (json.dumps(fileinfo["meta"]), fileinfo["sequence"]))
qry = """
@@ -452,7 +456,7 @@ class Datastore:
with self.conn.cursor() as cursor:
cursor.execute("BEGIN;")
hash = self.add_file(filepath, cursor)
qry = """
@@ -462,13 +466,13 @@ class Datastore:
"""
cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"], json.dumps(fileinfo["meta"])))
qry = """
UPDATE raw_lines
SET meta = meta || %s
WHERE sequence = %s;
"""
cursor.execute(qry, (json.dumps(fileinfo["meta"]), fileinfo["sequence"]))
qry = """
@@ -495,7 +499,7 @@ class Datastore:
if filedata is not None:
self.save_file_data(filepath, json.dumps(filedata), cursor)
cursor.execute("CALL final_line_post_import(%s);", (fileinfo["sequence"],))
self.maybe_commit()
@@ -662,7 +666,7 @@ class Datastore:
"""
Remove final data for a sequence.
"""
if cursor is None:
cur = self.conn.cursor()
else:
@@ -674,4 +678,20 @@ class Datastore:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def housekeep_event_log(self, cursor = None):
"""
Call housekeeping actions on the event log
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
qry = "CALL augment_event_data();"
cur.execute(qry)
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction

bin/housekeep_database.py Executable file

@@ -0,0 +1,25 @@
#!/usr/bin/python3
"""
Do housekeeping actions on the database.
"""
import configuration
from datastore import Datastore
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
print("Connecting to database")
db = Datastore()
print("Reading surveys")
for survey in surveys:
print(f'Survey: {survey["id"]} ({survey["schema"]})')
db.set_survey(survey["schema"])
db.housekeep_event_log()
print("Done")


@@ -59,7 +59,7 @@ def qc_data (cursor, prefix):
else:
print("No QC data found");
return
#print("QC", qc)
index = 0
for item in qc["results"]:


@@ -39,7 +39,7 @@ def seis_data (survey):
if not pathlib.Path(pathPrefix).exists():
print(pathPrefix)
raise ValueError("Export path does not exist")
print(f"Requesting sequences for {survey['id']}")
url = f"http://localhost:3000/api/project/{survey['id']}/sequence"
r = requests.get(url)
@@ -47,12 +47,12 @@ def seis_data (survey):
for sequence in r.json():
if sequence['status'] not in ["final", "ntbp"]:
continue
filename = pathlib.Path(pathPrefix, "sequence{:0>3d}.json".format(sequence['sequence']))
if filename.exists():
print(f"Skipping export for sequence {sequence['sequence']} file already exists")
continue
print(f"Processing sequence {sequence['sequence']}")
url = f"http://localhost:3000/api/project/{survey['id']}/event?sequence={sequence['sequence']}&missing=t"
headers = { "Accept": "application/vnd.seis+json" }


@@ -19,7 +19,7 @@ from datastore import Datastore
def add_pending_remark(db, sequence):
text = '<!-- @@DGL:PENDING@@ --><h4 style="color:red;cursor:help;" title="Edit the sequence file or directory name to import final data">Marked as <code>PENDING</code>.</h4><!-- @@/DGL:PENDING@@ -->\n'
with db.conn.cursor() as cursor:
qry = "SELECT remarks FROM raw_lines WHERE sequence = %s;"
cursor.execute(qry, (sequence,))
@@ -33,18 +33,20 @@ def add_pending_remark(db, sequence):
db.maybe_commit()
def del_pending_remark(db, sequence):
with db.conn.cursor() as cursor:
qry = "SELECT remarks FROM raw_lines WHERE sequence = %s;"
cursor.execute(qry, (sequence,))
remarks = cursor.fetchone()[0]
rx = re.compile("^(<!-- @@DGL:PENDING@@ -->.*<!-- @@/DGL:PENDING@@ -->\n)")
m = rx.match(remarks)
if m is not None:
remarks = rx.sub("",remarks)
qry = "UPDATE raw_lines SET remarks = %s WHERE sequence = %s;"
cursor.execute(qry, (remarks, sequence))
db.maybe_commit()
row = cursor.fetchone()
if row is not None:
remarks = row[0]
rx = re.compile("^(<!-- @@DGL:PENDING@@ -->.*<!-- @@/DGL:PENDING@@ -->\n)")
m = rx.match(remarks)
if m is not None:
remarks = rx.sub("",remarks)
qry = "UPDATE raw_lines SET remarks = %s WHERE sequence = %s;"
cursor.execute(qry, (remarks, sequence))
db.maybe_commit()
if __name__ == '__main__':
@@ -87,12 +89,12 @@ if __name__ == '__main__':
pending = pendingRx.search(filepath) is not None
if not db.file_in_db(filepath):
age = time.time() - os.path.getmtime(filepath)
if age < file_min_age:
print("Skipping file because too new", filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))
@@ -104,7 +106,7 @@ if __name__ == '__main__':
file_info = dict(zip(pattern["captures"], match.groups()))
file_info["meta"] = {}
if pending:
print("Skipping / removing final file because marked as PENDING", filepath)
db.del_sequence_final(file_info["sequence"])


@@ -51,12 +51,12 @@ if __name__ == '__main__':
print(f"Found {filepath}")
if not db.file_in_db(filepath):
age = time.time() - os.path.getmtime(filepath)
if age < file_min_age:
print("Skipping file because too new", filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))


@@ -31,12 +31,12 @@ if __name__ == '__main__':
for file in survey["preplots"]:
print(f"Preplot: {file['path']}")
if not db.file_in_db(file["path"]):
age = time.time() - os.path.getmtime(file["path"])
if age < file_min_age:
print("Skipping file because too new", file["path"])
continue
print("Importing")
try:
preplot = preplots.from_file(file)


@@ -59,12 +59,12 @@ if __name__ == '__main__':
ntbp = False
if not db.file_in_db(filepath):
age = time.time() - os.path.getmtime(filepath)
if age < file_min_age:
print("Skipping file because too new", filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))
@@ -82,9 +82,12 @@ if __name__ == '__main__':
print("Saving")
p111_records = p111.p111_type("S", p111_data)
file_info["meta"]["lineName"] = p111.line_name(p111_data)
if len(p111_records):
file_info["meta"]["lineName"] = p111.line_name(p111_data)
db.save_raw_p111(p111_records, file_info, filepath, survey["epsg"], ntbp=ntbp)
db.save_raw_p111(p111_records, file_info, filepath, survey["epsg"], ntbp=ntbp)
else:
print("No source records found in file")
else:
print("Already in DB")


@@ -54,12 +54,12 @@ if __name__ == '__main__':
print(f"Found {filepath}")
if not db.file_in_db(filepath):
age = time.time() - os.path.getmtime(filepath)
if age < file_min_age:
print("Skipping file because too new", filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))


@@ -55,12 +55,12 @@ if __name__ == '__main__':
print(f"Found {filepath}")
if not db.file_in_db(filepath):
age = time.time() - os.path.getmtime(filepath)
if age < file_min_age:
print("Skipping file because too new", filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))


@@ -14,7 +14,7 @@ def detect_schema (conn):
if __name__ == '__main__':
import argparse
ap = argparse.ArgumentParser()
ap.add_argument("-s", "--schema", required=False, default=None, help="survey where to insert the event")
ap.add_argument("-t", "--tstamp", required=False, default=None, help="event timestamp")
@@ -30,16 +30,16 @@ if __name__ == '__main__':
schema = args["schema"]
else:
schema = detect_schema(db.conn)
if args["tstamp"]:
tstamp = args["tstamp"]
else:
tstamp = datetime.utcnow().isoformat()
message = " ".join(args["remarks"])
print("new event:", schema, tstamp, message)
if schema and tstamp and message:
db.set_survey(schema)
with db.conn.cursor() as cursor:


@@ -12,7 +12,7 @@ from parse_fwr import parse_fwr
def parse_p190_header (string):
"""Parse a generic P1/90 header record.
Returns a dictionary of fields.
"""
names = [ "record_type", "header_type", "header_type_modifier", "description", "data" ]
@@ -27,7 +27,7 @@ def parse_p190_type1 (string):
"doy", "time", "spare2" ]
record = parse_fwr(string, [1, 12, 3, 1, 1, 1, 6, 10, 11, 9, 9, 6, 3, 6, 1])
return dict(zip(names, record))
def parse_p190_rcv_group (string):
"""Parse a P1/90 Type 1 receiver group record."""
names = [ "record_type",
@@ -37,7 +37,7 @@ def parse_p190_rcv_group (string):
"streamer_id" ]
record = parse_fwr(string, [1, 4, 9, 9, 4, 4, 9, 9, 4, 4, 9, 9, 4, 1])
return dict(zip(names, record))
def parse_line (string):
type = string[0]
if string[:3] == "EOF":
@@ -52,7 +52,7 @@ def parse_line (string):
def p190_type(type, records):
return [ r for r in records if r["record_type"] == type ]
def p190_header(code, records):
return [ h for h in p190_type("H", records) if h["header_type"]+h["header_type_modifier"] == code ]
@@ -86,15 +86,15 @@ def normalise_record(record):
# These are probably strings
elif "strip" in dir(record[key]):
record[key] = record[key].strip()
return record
def normalise(records):
for record in records:
normalise_record(record)
return records
def from_file(path, only_records=None, shot_range=None, with_objrefs=False):
records = []
with open(path) as fd:
@@ -102,10 +102,10 @@ def from_file(path, only_records=None, shot_range=None, with_objrefs=False):
line = fd.readline()
while line:
cnt = cnt + 1
if line == "EOF":
break
record = parse_line(line)
if record is not None:
if only_records:
@@ -121,9 +121,9 @@ def from_file(path, only_records=None, shot_range=None, with_objrefs=False):
records.append(record)
line = fd.readline()
return records
def apply_tstamps(recordset, tstamp=None, fix_bad_seconds=False):
#print("tstamp", tstamp, type(tstamp))
if type(tstamp) is int:
@@ -161,16 +161,16 @@ def apply_tstamps(recordset, tstamp=None, fix_bad_seconds=False):
record["tstamp"] = ts
prev[object_id(record)] = doy
break
return recordset
def dms(value):
# 591544.61N
hemisphere = 1 if value[-1] in "NnEe" else -1
seconds = float(value[-6:-1])
minutes = int(value[-8:-6])
degrees = int(value[:-8])
return (degrees + minutes/60 + seconds/3600) * hemisphere
def tod(record):
@@ -183,7 +183,7 @@ def tod(record):
m = int(time[2:4])
s = float(time[4:])
return d*86400 + h*3600 + m*60 + s
def duration(record0, record1):
ts0 = tod(record0)
ts1 = tod(record1)
@@ -198,10 +198,10 @@ def azimuth(record0, record1):
x0, y0 = float(record0["easting"]), float(record0["northing"])
x1, y1 = float(record1["easting"]), float(record1["northing"])
return math.degrees(math.atan2(x1-x0, y1-y0)) % 360
def speed(record0, record1, knots=False):
scale = 3600/1852 if knots else 1
t0 = tod(record0)
t1 = tod(record1)
return (distance(record0, record1) / math.fabs(t1-t0)) * scale


@@ -98,25 +98,30 @@ run $BINDIR/import_final_p190.py
print_log "Import SmartSource data"
run $BINDIR/import_smsrc.py
if [[ -z "$RUNNER_NOEXPORT" ]]; then
print_log "Export system data"
run $BINDIR/system_exports.py
fi
# if [[ -z "$RUNNER_NOEXPORT" ]]; then
# print_log "Export system data"
# run $BINDIR/system_exports.py
# fi
if [[ -n "$RUNNER_IMPORT" ]]; then
print_log "Import system data"
run $BINDIR/system_imports.py
fi
print_log "Export QC data"
run $BINDIR/human_exports_qc.py
# print_log "Export QC data"
# run $BINDIR/human_exports_qc.py
print_log "Export sequence data"
run $BINDIR/human_exports_seis.py
# print_log "Export sequence data"
# run $BINDIR/human_exports_seis.py
print_log "Process ASAQC queue"
# Run insecure in test mode:
# export NODE_TLS_REJECT_UNAUTHORIZED=0
run $DOUGAL_ROOT/lib/www/server/queues/asaqc/index.js
print_log "Run database housekeeping actions"
run $BINDIR/housekeep_database.py
rm "$LOCKFILE"
print_info "End run"


@@ -39,7 +39,7 @@ exportables = {
}
def primary_key (table, cursor):
# https://wiki.postgresql.org/wiki/Retrieve_primary_key_columns
qry = """
SELECT a.attname, format_type(a.atttypid, a.atttypmod) AS data_type
@@ -50,7 +50,7 @@ def primary_key (table, cursor):
WHERE i.indrelid = %s::regclass
AND i.indisprimary;
"""
cursor.execute(qry, (table,))
return cursor.fetchall()


@@ -34,7 +34,7 @@ exportables = {
}
def primary_key (table, cursor):
# https://wiki.postgresql.org/wiki/Retrieve_primary_key_columns
qry = """
SELECT a.attname, format_type(a.atttypid, a.atttypmod) AS data_type
@@ -45,13 +45,13 @@ def primary_key (table, cursor):
WHERE i.indrelid = %s::regclass
AND i.indisprimary;
"""
cursor.execute(qry, (table,))
return cursor.fetchall()
def import_table(fd, table, columns, cursor):
pk = [ r[0] for r in primary_key(table, cursor) ]
# Create temporary table to import into
temptable = "import_"+table
print("Creating temporary table", temptable)
@@ -61,29 +61,29 @@ def import_table(fd, table, columns, cursor):
AS SELECT {', '.join(pk + columns)} FROM {table}
WITH NO DATA;
"""
#print(qry)
cursor.execute(qry)
# Import into the temp table
print("Import data into temporary table")
cursor.copy_from(fd, temptable)
# Update the destination table
print("Updating destination table")
setcols = ", ".join([ f"{c} = t.{c}" for c in columns ])
wherecols = " AND ".join([ f"{table}.{c} = t.{c}" for c in pk ])
qry = f"""
UPDATE {table}
SET {setcols}
FROM {temptable} t
WHERE {wherecols};
"""
#print(qry)
cursor.execute(qry)
if __name__ == '__main__':
@@ -111,7 +111,7 @@ if __name__ == '__main__':
print(f"It looks like table {table} may have already been imported. Skipping it.")
except FileNotFoundError:
print(f"File not found. Skipping {path}")
db.conn.commit()
print("Reading surveys")
@@ -130,7 +130,7 @@ if __name__ == '__main__':
columns = exportables["survey"][table]
path = os.path.join(pathPrefix, "-"+table)
print(" ←← ", path, " →→ ", table, columns)
try:
with open(path, "rb") as fd:
if columns is not None:
@@ -143,7 +143,7 @@ if __name__ == '__main__':
print(f"It looks like table {table} may have already been imported. Skipping it.")
except FileNotFoundError:
print(f"File not found. Skipping {path}")
# If we don't commit the data does not actually get copied
db.conn.commit()


@@ -19,3 +19,124 @@ Created with:
```bash
SCHEMA_NAME=survey_X EPSG_CODE=XXXXX $DOUGAL_ROOT/sbin/dump_schema.sh
```
## To create a new Dougal database
Ensure that the following packages are installed:
* `postgresql*-postgis-utils`
* `postgresql*-postgis`
* `postgresql*-contrib` # For B-trees
```bash
psql -U postgres <./database-template.sql
psql -U postgres <./database-version.sql
```
---
# Upgrading PostgreSQL
The following is based on https://en.opensuse.org/SDB:PostgreSQL#Upgrading_major_PostgreSQL_version
```bash
# The following bash code should be checked and executed
# line by line whenever you do an upgrade. The example
# shows the upgrade process from an original installation
# of version 12 up to version 14.
# install the new server as well as the required postgresql-contrib packages:
zypper in postgresql14-server postgresql14-contrib postgresql12-contrib
# If not yet done, create a new global PostgreSQL configuration directory...
mkdir /etc/postgresql
# and copy the original file to this global directory
cd /srv/pgsql/data
for i in pg_hba.conf pg_ident.conf postgresql.conf postgresql.auto.conf ; do cp -a $i /etc/postgresql/$i ; done
# Now create a new data-directory and initialize it for usage with the new server
install -d -m 0700 -o postgres -g postgres /srv/pgsql/data14
cd /srv/pgsql/data14
sudo -u postgres /usr/lib/postgresql14/bin/initdb .
# replace the newly generated files by a symlink to the global files.
# After doing so, you may check the difference of the created backup files and
# the files from the former installation
for i in pg_hba.conf pg_ident.conf postgresql.conf postgresql.auto.conf ; do old $i ; ln -s /etc/postgresql/$i .; done
# Copy over special thesaurus files if any exist.
#cp -a /usr/share/postgresql12/tsearch_data/my_thesaurus_german.ths /usr/share/postgresql14/tsearch_data/
# Now it's time to disable the service...
systemctl stop postgresql.service
# And to start the migration. Please ensure the directories match your upgrade path
sudo -u postgres /usr/lib/postgresql14/bin/pg_upgrade --link \
--old-bindir="/usr/lib/postgresql12/bin" \
--new-bindir="/usr/lib/postgresql14/bin" \
--old-datadir="/srv/pgsql/data/" \
--new-datadir="/srv/pgsql/data14/"
# NOTE: If getting the following error:
# lc_collate values for database "postgres" do not match: old "en_US.UTF-8", new "C"
# then:
# cd ..
# rm -rf /srv/pgsql/data14
# install -d -m 0700 -o postgres -g postgres /srv/pgsql/data14
# cd /srv/pgsql/data14
# sudo -u postgres /usr/lib/postgresql14/bin/initdb --locale=en_US.UTF-8 .
#
# and repeat the migration command
# After successfully migrating the data...
cd ..
# if not already symlinked move the old data to a versioned directory matching
# your old installation...
mv data data12
# and set a symlink to the new data directory
ln -sf data14/ data
# Now start the new service
systemctl start postgresql.service
# If everything has been successful, you should uninstall old packages...
#zypper rm -u postgresql12 postgresql13
# and remove old data directories
#rm -rf /srv/pgsql/data_OLD_POSTGRES_VERSION_NUMBER
# For good measure:
sudo -u postgres /usr/lib/postgresql14/bin/vacuumdb --all --analyze-in-stages
# If update_extensions.sql exists, apply it.
```
# Restoring from backup
## Whole database
Ensure that nothing is connected to the database.
```bash
psql -U postgres --dbname postgres <<EOF
-- Database: dougal
DROP DATABASE IF EXISTS dougal;
CREATE DATABASE dougal
WITH
OWNER = postgres
ENCODING = 'UTF8'
LC_COLLATE = 'en_GB.UTF-8'
LC_CTYPE = 'en_GB.UTF-8'
TABLESPACE = pg_default
CONNECTION LIMIT = -1;
ALTER DATABASE dougal
SET search_path TO "$user", public, topology;
EOF
# Adjust --jobs according to host machine
pg_restore -U postgres --dbname dougal --clean --if-exists --jobs 32 /path/to/backup
```


@@ -2,8 +2,8 @@
-- PostgreSQL database dump
--
-- Dumped from database version 12.4
-- Dumped by pg_dump version 12.4
-- Dumped from database version 14.2
-- Dumped by pg_dump version 14.2
SET statement_timeout = 0;
SET lock_timeout = 0;
@@ -20,7 +20,7 @@ SET row_security = off;
-- Name: dougal; Type: DATABASE; Schema: -; Owner: postgres
--
CREATE DATABASE dougal WITH TEMPLATE = template0 ENCODING = 'UTF8' LC_COLLATE = 'C' LC_CTYPE = 'en_GB.UTF-8';
CREATE DATABASE dougal WITH TEMPLATE = template0 ENCODING = 'UTF8' LOCALE = 'en_GB.UTF-8';
ALTER DATABASE dougal OWNER TO postgres;
@@ -102,20 +102,6 @@ CREATE EXTENSION IF NOT EXISTS postgis WITH SCHEMA public;
COMMENT ON EXTENSION postgis IS 'PostGIS geometry, geography, and raster spatial types and functions';
--
-- Name: postgis_raster; Type: EXTENSION; Schema: -; Owner: -
--
CREATE EXTENSION IF NOT EXISTS postgis_raster WITH SCHEMA public;
--
-- Name: EXTENSION postgis_raster; Type: COMMENT; Schema: -; Owner:
--
COMMENT ON EXTENSION postgis_raster IS 'PostGIS raster types and functions';
--
-- Name: postgis_sfcgal; Type: EXTENSION; Schema: -; Owner: -
--
@@ -144,6 +130,48 @@ CREATE EXTENSION IF NOT EXISTS postgis_topology WITH SCHEMA topology;
COMMENT ON EXTENSION postgis_topology IS 'PostGIS topology spatial types and functions';
--
-- Name: queue_item_status; Type: TYPE; Schema: public; Owner: postgres
--
CREATE TYPE public.queue_item_status AS ENUM (
'queued',
'cancelled',
'failed',
'sent'
);
ALTER TYPE public.queue_item_status OWNER TO postgres;
--
-- Name: geometry_from_tstamp(timestamp with time zone, numeric); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.geometry_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT geometry public.geometry, OUT delta numeric) RETURNS record
LANGUAGE sql
AS $$
SELECT
geometry,
extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ) AS delta
FROM real_time_inputs
WHERE
geometry IS NOT NULL AND
abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts )) < tolerance
ORDER BY abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ))
LIMIT 1;
$$;
ALTER FUNCTION public.geometry_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT geometry public.geometry, OUT delta numeric) OWNER TO postgres;
--
-- Name: FUNCTION geometry_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT geometry public.geometry, OUT delta numeric); Type: COMMENT; Schema: public; Owner: postgres
--
COMMENT ON FUNCTION public.geometry_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT geometry public.geometry, OUT delta numeric) IS 'Get geometry from timestamp';
--
-- Name: notify(); Type: FUNCTION; Schema: public; Owner: postgres
--
@@ -182,23 +210,110 @@ $$;
ALTER FUNCTION public.notify() OWNER TO postgres;
--
-- Name: sequence_shot_from_tstamp(timestamp with time zone); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, OUT sequence numeric, OUT point numeric, OUT delta numeric) RETURNS record
LANGUAGE sql
AS $$
SELECT * FROM public.sequence_shot_from_tstamp(ts, 3);
$$;
ALTER FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, OUT sequence numeric, OUT point numeric, OUT delta numeric) OWNER TO postgres;
--
-- Name: FUNCTION sequence_shot_from_tstamp(ts timestamp with time zone, OUT sequence numeric, OUT point numeric, OUT delta numeric); Type: COMMENT; Schema: public; Owner: postgres
--
COMMENT ON FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, OUT sequence numeric, OUT point numeric, OUT delta numeric) IS 'Get sequence and shotpoint from timestamp.
Overloaded form in which the tolerance value is implied and defaults to three seconds.';
--
-- Name: sequence_shot_from_tstamp(timestamp with time zone, numeric); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT sequence numeric, OUT point numeric, OUT delta numeric) RETURNS record
LANGUAGE sql
AS $$
SELECT
(meta->>'_sequence')::numeric AS sequence,
(meta->>'_point')::numeric AS point,
extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ) AS delta
FROM real_time_inputs
WHERE
meta ? '_sequence' AND
abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts )) < tolerance
ORDER BY abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ))
LIMIT 1;
$$;
ALTER FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT sequence numeric, OUT point numeric, OUT delta numeric) OWNER TO postgres;
--
-- Name: FUNCTION sequence_shot_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT sequence numeric, OUT point numeric, OUT delta numeric); Type: COMMENT; Schema: public; Owner: postgres
--
COMMENT ON FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT sequence numeric, OUT point numeric, OUT delta numeric) IS 'Get sequence and shotpoint from timestamp.
Given a timestamp this function returns the closest shot to it within the given tolerance value.
This uses the `real_time_inputs` table and it does not give an indication of which project the shotpoint belongs to. It is assumed that a single project is being acquired at a given time.';
--
-- Name: set_survey(text); Type: PROCEDURE; Schema: public; Owner: postgres
--
CREATE PROCEDURE public.set_survey(project_id text)
CREATE PROCEDURE public.set_survey(IN project_id text)
LANGUAGE sql
AS $$
SELECT set_config('search_path', (SELECT schema||',public' FROM public.projects WHERE pid = lower(project_id)), false);
$$;
ALTER PROCEDURE public.set_survey(project_id text) OWNER TO postgres;
ALTER PROCEDURE public.set_survey(IN project_id text) OWNER TO postgres;
--
-- Name: update_timestamp(); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.update_timestamp() RETURNS trigger
LANGUAGE plpgsql
AS $$
BEGIN
IF NEW.updated_on IS NOT NULL THEN
NEW.updated_on := current_timestamp;
END IF;
RETURN NEW;
EXCEPTION
WHEN undefined_column THEN RETURN NEW;
END;
$$;
ALTER FUNCTION public.update_timestamp() OWNER TO postgres;
SET default_tablespace = '';
SET default_table_access_method = heap;
--
-- Name: info; Type: TABLE; Schema: public; Owner: postgres
--
CREATE TABLE public.info (
key text NOT NULL,
value jsonb
);
ALTER TABLE public.info OWNER TO postgres;
--
-- Name: projects; Type: TABLE; Schema: public; Owner: postgres
--
@@ -213,6 +328,46 @@ CREATE TABLE public.projects (
ALTER TABLE public.projects OWNER TO postgres;
--
-- Name: queue_items; Type: TABLE; Schema: public; Owner: postgres
--
CREATE TABLE public.queue_items (
item_id integer NOT NULL,
status public.queue_item_status DEFAULT 'queued'::public.queue_item_status NOT NULL,
payload jsonb NOT NULL,
results jsonb DEFAULT '{}'::jsonb NOT NULL,
created_on timestamp with time zone DEFAULT CURRENT_TIMESTAMP NOT NULL,
updated_on timestamp with time zone DEFAULT CURRENT_TIMESTAMP NOT NULL,
not_before timestamp with time zone DEFAULT '1970-01-01 00:00:00+00'::timestamp with time zone NOT NULL,
parent_id integer
);
ALTER TABLE public.queue_items OWNER TO postgres;
--
-- Name: queue_items_item_id_seq; Type: SEQUENCE; Schema: public; Owner: postgres
--
CREATE SEQUENCE public.queue_items_item_id_seq
AS integer
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
ALTER TABLE public.queue_items_item_id_seq OWNER TO postgres;
--
-- Name: queue_items_item_id_seq; Type: SEQUENCE OWNED BY; Schema: public; Owner: postgres
--
ALTER SEQUENCE public.queue_items_item_id_seq OWNED BY public.queue_items.item_id;
--
-- Name: real_time_inputs; Type: TABLE; Schema: public; Owner: postgres
--
@@ -227,16 +382,19 @@ CREATE TABLE public.real_time_inputs (
ALTER TABLE public.real_time_inputs OWNER TO postgres;
--
-- Name: info; Type: TABLE; Schema: public; Owner: postgres
-- Name: queue_items item_id; Type: DEFAULT; Schema: public; Owner: postgres
--
CREATE TABLE public.info (
key text NOT NULL,
value jsonb
);
ALTER TABLE ONLY public.queue_items ALTER COLUMN item_id SET DEFAULT nextval('public.queue_items_item_id_seq'::regclass);
ALTER TABLE public.info OWNER TO postgres;
--
-- Name: info info_pkey; Type: CONSTRAINT; Schema: public; Owner: postgres
--
ALTER TABLE ONLY public.info
ADD CONSTRAINT info_pkey PRIMARY KEY (key);
--
-- Name: projects projects_name_key; Type: CONSTRAINT; Schema: public; Owner: postgres
@@ -262,15 +420,20 @@ ALTER TABLE ONLY public.projects
ADD CONSTRAINT projects_schema_key UNIQUE (schema);
--
-- Name: info info_pkey; Type: CONSTRAINT; Schema: public; Owner: postgres
-- Name: queue_items queue_items_pkey; Type: CONSTRAINT; Schema: public; Owner: postgres
--
ALTER TABLE ONLY public.info
ADD CONSTRAINT info_pkey PRIMARY KEY (key);
ALTER TABLE ONLY public.queue_items
ADD CONSTRAINT queue_items_pkey PRIMARY KEY (item_id);
--
-- Name: meta_tstamp_idx; Type: INDEX; Schema: public; Owner: postgres
--
CREATE INDEX meta_tstamp_idx ON public.real_time_inputs USING btree (((meta ->> 'tstamp'::text)) DESC);
--
-- Name: tstamp_idx; Type: INDEX; Schema: public; Owner: postgres
@@ -279,6 +442,13 @@ ALTER TABLE ONLY public.info
CREATE INDEX tstamp_idx ON public.real_time_inputs USING btree (tstamp DESC);
--
-- Name: info info_tg; Type: TRIGGER; Schema: public; Owner: postgres
--
CREATE TRIGGER info_tg AFTER INSERT OR DELETE OR UPDATE ON public.info FOR EACH ROW EXECUTE FUNCTION public.notify('info');
--
-- Name: projects projects_tg; Type: TRIGGER; Schema: public; Owner: postgres
--
@@ -286,6 +456,20 @@ CREATE INDEX tstamp_idx ON public.real_time_inputs USING btree (tstamp DESC);
CREATE TRIGGER projects_tg AFTER INSERT OR DELETE OR UPDATE ON public.projects FOR EACH ROW EXECUTE FUNCTION public.notify('project');
--
-- Name: queue_items queue_items_tg0; Type: TRIGGER; Schema: public; Owner: postgres
--
CREATE TRIGGER queue_items_tg0 BEFORE INSERT OR UPDATE ON public.queue_items FOR EACH ROW EXECUTE FUNCTION public.update_timestamp();
--
-- Name: queue_items queue_items_tg1; Type: TRIGGER; Schema: public; Owner: postgres
--
CREATE TRIGGER queue_items_tg1 AFTER INSERT OR DELETE OR UPDATE ON public.queue_items FOR EACH ROW EXECUTE FUNCTION public.notify('queue_items');
--
-- Name: real_time_inputs real_time_inputs_tg; Type: TRIGGER; Schema: public; Owner: postgres
--
@@ -294,10 +478,11 @@ CREATE TRIGGER real_time_inputs_tg AFTER INSERT ON public.real_time_inputs FOR E
--
-- Name: info info_tg; Type: TRIGGER; Schema: public; Owner: postgres
-- Name: queue_items queue_items_parent_id_fkey; Type: FK CONSTRAINT; Schema: public; Owner: postgres
--
CREATE TRIGGER info_tg AFTER INSERT OR DELETE OR UPDATE ON public.info FOR EACH ROW EXECUTE FUNCTION public.notify('info');
ALTER TABLE ONLY public.queue_items
ADD CONSTRAINT queue_items_parent_id_fkey FOREIGN KEY (parent_id) REFERENCES public.queue_items(item_id);
--


@@ -0,0 +1,3 @@
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.4"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.4"}' WHERE public.info.key = 'version';

File diff suppressed because it is too large

etc/db/upgrades/README.md Normal file

@@ -0,0 +1,34 @@
# Database schema upgrades
When the database schema needs to be upgraded in order to provide new functionality, fix errors, etc., an upgrade script should be added to this directory.
The script can be SQL (preferred) or anything else (Bash, Python, …) in the event of complex upgrades.
The script itself should:
* document what the intended changes are;
* contain instructions on how to run it;
* make the user aware of any non-obvious side effects; and
* say if it is safe to run the script multiple times on the same
  schema / database.
## Naming
Script files should be named `upgrade-<index>-<commit-id-old>-<commit-id-new>-v<schema-version>.sql`, where:
* `<index>` is a sequential two-digit index. When it reaches 99, existing files will be renamed to a three-digit index (001-099) and new files will use three digits.
* `<commit-id-old>` is the ID of the Git commit that last introduced a schema change.
* `<commit-id-new>` is the ID of the first Git commit expecting the updated schema.
* `<schema-version>` is the version of the schema.
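For example, going by the header of the upgrade 09 script shown further down (commits 74b3de5c → 83be83e4, schema version 0.1.0), its file would be named something like `upgrade-09-74b3de5c-83be83e4-v0.1.0.sql`; the exact file name is not shown in this diff.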
Note: the `<schema-version>` value should be updated with every change and it should be the same as reported by:
```sql
select value->>'db_schema' as db_schema from public.info where key = 'version';
```
If necessary, the wanted schema version must also be updated in `package.json`.
## Running
Schema upgrades are always run manually.


@@ -0,0 +1,24 @@
-- Upgrade the database from commit 74b3de5c to commit 83be83e4.
--
-- NOTE: This upgrade only affects the `public` schema.
--
-- This inserts a database schema version into the database.
-- Note that we are not otherwise changing the schema, so older
-- server code will continue to run against this version.
--
-- ATTENTION!
--
-- This value should be incremented every time that the database
-- schema changes (either `public` or any of the survey schemas)
-- and is used by the server at start-up to detect if it is
-- running against a compatible schema version.
--
-- To apply, run as the dougal user:
--
-- psql < $THIS_FILE
--
-- NOTE: It can be applied multiple times without ill effect.
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.1.0"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.1.0"}' WHERE public.info.key = 'version';


@@ -0,0 +1,84 @@
-- Upgrade the database from commit 83be83e4 to 53ed096e.
--
-- New schema version: 0.2.0
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This migrates the file hashes to address issue #173.
-- The new hashes use size, modification time, creation time and the
-- first half of the MD5 hex digest of the file's absolute path.
--
-- It's a minor (rather than patch) version number increment because
-- changes to `bin/datastore.py` mean that the existing hashes are no
-- longer compatible with the new hashing function.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE migrate_hashes (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Migrating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
EXECUTE format('UPDATE %I.files SET hash = array_to_string(array_append(trim_array(string_to_array(hash, '':''), 1), left(md5(path), 16)), '':'')', schema_name);
EXECUTE 'SET search_path TO public'; -- Back to the default search path for good measure
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE upgrade_10 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL migrate_hashes(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL upgrade_10();
CALL show_notice('Cleaning up');
DROP PROCEDURE migrate_hashes (schema_name text);
DROP PROCEDURE upgrade_10 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.2.0"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.2.0"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,189 @@
-- Add function to retrieve sequence/shotpoint from timestamps and vice-versa
--
-- New schema version: 0.2.1
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects the public schema.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- Two new functions are defined:
--
-- sequence_shot_from_tstamp(tstamp, [tolerance]) → sequence, point, delta
--
-- Returns a sequence + shotpoint if one falls within `tolerance` seconds
-- of `tstamp`. The tolerance may be omitted in which case it defaults to
-- three seconds. If multiple values match, it returns the closest in time.
--
-- tstamp_from_sequence_shot(sequence, point) → tstamp
--
-- Returns a timestamp given a sequence and point number.
--
-- NOTE: This last function must be called from a search path including a
-- project schema, as it accesses the raw_shots table.
--
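-- Example usage (hypothetical values):
--
--   SELECT * FROM public.sequence_shot_from_tstamp('2022-04-01 12:00:03+00');
--   -- → sequence 1042, point 2417, delta -1.2
--
--   SET search_path TO survey_1,public;
--   SELECT tstamp_from_sequence_shot(1042, 2417);
--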
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect.
-- NOTE: This will lock the database while the transaction is active.
--
-- NOTE: This is a patch version change so it does not require a
-- backend restart.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION tstamp_from_sequence_shot(
IN s numeric,
IN p numeric,
OUT "ts" timestamptz)
AS $inner$
SELECT tstamp FROM raw_shots WHERE sequence = s AND point = p LIMIT 1;
$inner$ LANGUAGE SQL;
COMMENT ON FUNCTION tstamp_from_sequence_shot(numeric, numeric)
IS 'Get the timestamp of an existing shotpoint.';
CREATE OR REPLACE FUNCTION tstamp_interpolate(s numeric, p numeric) RETURNS timestamptz
AS $inner$
DECLARE
ts0 timestamptz;
ts1 timestamptz;
pt0 numeric;
pt1 numeric;
BEGIN
SELECT tstamp, point
INTO ts0, pt0
FROM raw_shots
WHERE sequence = s AND point < p
ORDER BY point DESC LIMIT 1;
SELECT tstamp, point
INTO ts1, pt1
FROM raw_shots
WHERE sequence = s AND point > p
ORDER BY point ASC LIMIT 1;
RETURN (ts1-ts0)/abs(pt1-pt0)*abs(p-pt0)+ts0;
END;
$inner$ LANGUAGE PLPGSQL;
COMMENT ON FUNCTION tstamp_interpolate(numeric, numeric)
IS 'Interpolate a timestamp given sequence and point values.
It will try to find the points immediately before and after in the sequence and interpolate into the gap, which may consist of multiple missed shots.
If called on an existing shotpoint it will return an interpolated timestamp as if the shotpoint did not exist, as opposed to returning its actual timestamp.
Returns NULL if it is not possible to interpolate.';
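-- Worked example (hypothetical values): with point 100 shot at
-- 12:00:00 and point 104 shot at 12:00:16 in the same sequence,
-- tstamp_interpolate(s, 101) returns (16 s / 4) * 1 + 12:00:00,
-- i.e. 12:00:04.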
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_database () AS $$
DECLARE
row RECORD;
BEGIN
CREATE OR REPLACE FUNCTION public.sequence_shot_from_tstamp(
IN ts timestamptz,
IN tolerance numeric,
OUT "sequence" numeric,
OUT "point" numeric,
OUT "delta" numeric)
AS $inner$
SELECT
(meta->>'_sequence')::numeric AS sequence,
(meta->>'_point')::numeric AS point,
extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ) AS delta
FROM real_time_inputs
WHERE
meta ? '_sequence' AND
abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts )) < tolerance
ORDER BY abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ))
LIMIT 1;
$inner$ LANGUAGE SQL;
COMMENT ON FUNCTION public.sequence_shot_from_tstamp(timestamptz, numeric)
IS 'Get sequence and shotpoint from timestamp.
Given a timestamp this function returns the closest shot to it within the given tolerance value.
This uses the `real_time_inputs` table and it does not give an indication of which project the shotpoint belongs to. It is assumed that a single project is being acquired at a given time.';
CREATE OR REPLACE FUNCTION public.sequence_shot_from_tstamp(
IN ts timestamptz,
OUT "sequence" numeric,
OUT "point" numeric,
OUT "delta" numeric)
AS $inner$
SELECT * FROM public.sequence_shot_from_tstamp(ts, 3);
$inner$ LANGUAGE SQL;
COMMENT ON FUNCTION public.sequence_shot_from_tstamp(timestamptz)
IS 'Get sequence and shotpoint from timestamp.
Overloaded form in which the tolerance value is implied and defaults to three seconds.';
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_database();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_database ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.2.1"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.2.1"}' WHERE public.info.key = 'version';
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,360 @@
-- Add new event log schema.
--
-- New schema version: 0.2.2
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
-- REQUIRES POSTGRESQL VERSION 14 OR NEWER
-- (Because of CREATE OR REPLACE TRIGGER)
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This is a redesign of the event logging mechanism. The old mechanism
-- relied on a distinction between sequence events (i.e., those which can
-- be associated to a shotpoint within a sequence), timed events (those
-- which occur outside any acquisition sequence) and so-called virtual
-- events (deduced from the data). It was inflexible and inefficient,
-- as most of the time we needed to merge those two types of events into
-- a single view.
--
-- The new mechanism:
-- - uses a single table
-- - accepts sequence event entries for shots or sequences which may not (yet)
-- exist. (https://gitlab.com/wgp/dougal/software/-/issues/170)
-- - keeps edit history (https://gitlab.com/wgp/dougal/software/-/issues/138)
-- - keeps track of when an entry was made or subsequently edited.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect: if the
-- new tables already exist, they will be emptied before the data is
-- migrated again.
--
-- WARNING: Applying this upgrade migrates the old event data. It does
-- NOT yet drop the old tables, which is handled in a separate script,
-- leaving the actions here technically reversible without having to
-- restore from backup.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE SEQUENCE IF NOT EXISTS event_log_uid_seq
AS integer
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
CREATE TABLE IF NOT EXISTS event_log_full (
-- uid is a unique id for each entry in the table,
-- including revisions of an existing entry.
uid integer NOT NULL PRIMARY KEY DEFAULT nextval('event_log_uid_seq'),
-- All revisions of an entry share the same id.
-- If inserting a new entry, id = uid.
id integer NOT NULL,
-- No default tstamp because, for instance, a user could
-- enter a sequence/point event referring to the future.
-- An external process should scan those at regular intervals
-- and populate the tstamp as needed.
tstamp timestamptz NULL,
sequence integer NULL,
point integer NULL,
remarks text NOT NULL DEFAULT '',
labels text[] NOT NULL DEFAULT ARRAY[]::text[],
-- TODO: Need a geometry column? Let us check performance as it is
-- and if needed either add a geometry column + spatial index.
meta jsonb NOT NULL DEFAULT '{}'::jsonb,
validity tstzrange NOT NULL CHECK (NOT isempty(validity)),
-- We accept either:
-- - Just a tstamp
-- - Just a sequence / point pair
-- - All three
-- We don't accept:
-- - A sequence without a point or vice-versa
-- - Nothing being provided
CHECK (
(tstamp IS NOT NULL AND sequence IS NOT NULL AND point IS NOT NULL) OR
(tstamp IS NOT NULL AND sequence IS NULL AND point IS NULL) OR
(tstamp IS NULL AND sequence IS NOT NULL AND point IS NOT NULL)
)
);
CREATE INDEX IF NOT EXISTS event_log_id ON event_log_full USING btree (id);
CREATE OR REPLACE FUNCTION event_log_full_insert() RETURNS TRIGGER AS $inner$
BEGIN
NEW.id := COALESCE(NEW.id, NEW.uid);
NEW.validity := tstzrange(current_timestamp, NULL);
NEW.meta = COALESCE(NEW.meta, '{}'::jsonb);
NEW.labels = COALESCE(NEW.labels, ARRAY[]::text[]);
IF cardinality(NEW.labels) > 0 THEN
-- Remove duplicates
SELECT array_agg(DISTINCT elements)
INTO NEW.labels
FROM (SELECT unnest(NEW.labels) AS elements) AS labels;
END IF;
RETURN NEW;
END;
$inner$ LANGUAGE plpgsql;
CREATE OR REPLACE TRIGGER event_log_full_insert_tg
BEFORE INSERT ON event_log_full
FOR EACH ROW EXECUTE FUNCTION event_log_full_insert();
-- The public.notify() trigger to alert clients that something has changed
CREATE OR REPLACE TRIGGER event_log_full_notify_tg
AFTER INSERT OR DELETE OR UPDATE
ON event_log_full FOR EACH ROW EXECUTE FUNCTION public.notify('event');
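-- For example (hypothetical values), both of these inserts satisfy
-- the CHECK constraint above (the BEFORE INSERT trigger fills in
-- id and validity):
--
--   INSERT INTO event_log_full (tstamp, remarks)
--   VALUES (current_timestamp, 'Timed event');
--
--   INSERT INTO event_log_full (sequence, point, remarks)
--   VALUES (1042, 2417, 'Shot event');
--
-- A row with a sequence but no point (or with none of the three
-- fields) would be rejected.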
--
-- VIEW event_log
--
-- This is what is exposed to the user most of the time.
-- It shows the current version of records in the event_log_full
-- table.
--
-- The user applies edits to this table directly, which are
-- processed via triggers.
--
CREATE OR REPLACE VIEW event_log AS
SELECT
id, tstamp, sequence, point, remarks, labels, meta,
uid <> id AS has_edits,
lower(validity) AS modified_on
FROM event_log_full
WHERE validity @> current_timestamp;
CREATE OR REPLACE FUNCTION event_log_update() RETURNS TRIGGER AS $inner$
BEGIN
IF (TG_OP = 'INSERT') THEN
-- Complete the tstamp if possible
IF NEW.sequence IS NOT NULL AND NEW.point IS NOT NULL AND NEW.tstamp IS NULL THEN
SELECT COALESCE(
tstamp_from_sequence_shot(NEW.sequence, NEW.point),
tstamp_interpolate(NEW.sequence, NEW.point)
)
INTO NEW.tstamp;
END IF;
-- Any id that is provided will be ignored. The generated
-- id will match uid.
INSERT INTO event_log_full
(tstamp, sequence, point, remarks, labels, meta)
VALUES (NEW.tstamp, NEW.sequence, NEW.point, NEW.remarks, NEW.labels, NEW.meta);
RETURN NEW;
ELSIF (TG_OP = 'UPDATE') THEN
-- Set end of validity and create a new entry with id
-- matching that of the old entry.
-- NOTE: Do not allow updating an event that has meta.readonly = true
IF EXISTS
(SELECT *
FROM event_log_full
WHERE id = OLD.id AND (meta->>'readonly')::boolean IS TRUE)
THEN
RAISE check_violation USING MESSAGE = 'Cannot modify read-only entry';
RETURN NULL;
END IF;
-- If the sequence / point has changed, and no new tstamp is provided, get one
IF (NEW.sequence <> OLD.sequence OR NEW.point <> OLD.point)
AND NEW.sequence IS NOT NULL AND NEW.point IS NOT NULL
AND (NEW.tstamp IS NULL OR NEW.tstamp = OLD.tstamp) THEN
SELECT COALESCE(
tstamp_from_sequence_shot(NEW.sequence, NEW.point),
tstamp_interpolate(NEW.sequence, NEW.point)
)
INTO NEW.tstamp;
END IF;
UPDATE event_log_full
SET validity = tstzrange(lower(validity), current_timestamp)
WHERE validity @> current_timestamp AND id = OLD.id;
-- Any attempt to modify id will be ignored.
INSERT INTO event_log_full
(id, tstamp, sequence, point, remarks, labels, meta)
VALUES (OLD.id, NEW.tstamp, NEW.sequence, NEW.point, NEW.remarks, NEW.labels, NEW.meta);
RETURN NEW;
ELSIF (TG_OP = 'DELETE') THEN
-- Set end of validity.
-- NOTE: We *do* allow deleting an event that has meta.readonly = true
-- This could be of interest if for instance we wanted to keep the history
-- of QC results for a point, provided that the QC routines write to
-- event_log and not event_log_full
UPDATE event_log_full
SET validity = tstzrange(lower(validity), current_timestamp)
WHERE validity @> current_timestamp AND id = OLD.id;
RETURN NULL;
END IF;
END;
$inner$ LANGUAGE plpgsql;
CREATE OR REPLACE TRIGGER event_log_tg
INSTEAD OF INSERT OR UPDATE OR DELETE ON event_log
FOR EACH ROW EXECUTE FUNCTION event_log_update();
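-- For example (hypothetical values), editing an entry through the
-- view creates a new revision rather than rewriting it in place:
--
--   INSERT INTO event_log (tstamp, remarks)
--   VALUES (current_timestamp, 'First log');
--   -- event_log_full now holds uid = 1, id = 1
--   UPDATE event_log SET remarks = 'First log (edited)' WHERE id = 1;
--   -- the uid = 1 row has its validity range closed and a new row
--   -- uid = 2, id = 1 becomes the current version; has_edits is true.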
-- NOTE
-- This is where we migrate the actual data
RAISE NOTICE 'Migrating schema %', schema_name;
-- We start by deleting any data that the new tables might
-- have had if they already existed.
DELETE FROM event_log_full;
-- We purposefully bypass event_log here, as the tables we're
-- migrating from only contain a single version of each event.
INSERT INTO event_log_full (tstamp, sequence, point, remarks, labels, meta)
SELECT
tstamp, sequence, point, remarks, labels,
meta || json_build_object('geometry', geometry, 'readonly', virtual)::jsonb
FROM events;
UPDATE event_log_full SET meta = meta - 'geometry' WHERE meta->>'geometry' IS NULL;
UPDATE event_log_full SET meta = meta - 'readonly' WHERE (meta->'readonly')::boolean IS false;
-- This function used the superseded `events` view.
-- We need to drop it because we're changing the return type.
DROP FUNCTION IF EXISTS label_in_sequence (_sequence integer, _label text);
CREATE OR REPLACE FUNCTION label_in_sequence (_sequence integer, _label text)
RETURNS event_log
LANGUAGE sql
AS $inner$
SELECT * FROM event_log WHERE sequence = _sequence AND _label = ANY(labels);
$inner$;
-- This function used the superseded `events` view (and some rather convoluted logic).
CREATE OR REPLACE PROCEDURE handle_final_line_events (_seq integer, _label text, _column text)
LANGUAGE plpgsql
AS $inner$
DECLARE
_line final_lines_summary%ROWTYPE;
_column_value integer;
_tg_name text := 'final_line';
_event event_log%ROWTYPE;
event_id integer;
BEGIN
SELECT * INTO _line FROM final_lines_summary WHERE sequence = _seq;
_event := label_in_sequence(_seq, _label);
_column_value := row_to_json(_line)->>_column;
--RAISE NOTICE '% is %', _label, _event;
--RAISE NOTICE 'Line is %', _line;
--RAISE NOTICE '% is % (%)', _column, _column_value, _label;
IF _event IS NULL THEN
--RAISE NOTICE 'We will populate the event log from the sequence data';
INSERT INTO event_log (sequence, point, remarks, labels, meta)
VALUES (
-- The sequence
_seq,
-- The shotpoint
_column_value,
-- Remark. Something like "FSP <linename>"
format('%s %s', _label, (SELECT meta->>'lineName' FROM final_lines WHERE sequence = _seq)),
-- Label
ARRAY[_label],
-- Meta. Something like {"auto" : {"FSP" : "final_line"}}
json_build_object('auto', json_build_object(_label, _tg_name))
);
ELSE
--RAISE NOTICE 'We may populate the sequence meta from the event log';
--RAISE NOTICE 'Unless the event log was populated by us previously';
--RAISE NOTICE 'Populated by us previously? %', _event.meta->'auto'->>_label = _tg_name;
IF _event.meta->'auto'->>_label IS DISTINCT FROM _tg_name THEN
--RAISE NOTICE 'Adding % found in events log to final_line meta', _label;
UPDATE final_lines
SET meta = jsonb_set(meta, ARRAY[_label], to_jsonb(_event.point))
WHERE sequence = _seq;
END IF;
END IF;
END;
$inner$;
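-- Example invocation (the column name here is hypothetical; the 'FSP'
-- label matches the usage described in the comments above):
--   CALL handle_final_line_events(1042, 'FSP', 'fsp');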
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_12 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_12();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_12 ();
CALL show_notice('Updating db_schema version');
-- This is technically still compatible with 0.2.0 as we are only adding
-- some more tables and views but not yet dropping the old ones, which we
-- will do separately so that these scripts do not get too big.
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.2.2"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.2.2"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,98 @@
-- Migrate events to new schema
--
-- New schema version: 0.3.0
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This migrates the data from the old event log tables to the new schema.
-- It is a *very* good idea to review the data manually after the migration,
-- as issues with the logs that had previously gone unnoticed may become evident.
--
-- WARNING: If data exists in the new event tables, IT WILL BE TRUNCATED.
--
-- Other than that, this migration is fairly benign as it does not modify
-- the old data.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect.
-- NOTE: This will lock the new event tables while the transaction is active.
--
-- WARNING: This is a minor (not patch) version change, meaning that it requires
-- an upgrade and restart of the backend server.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
TRUNCATE event_log_full;
-- NOTE: meta->>'virtual' = TRUE means that the event was created algorithmically
-- and should not be user editable.
INSERT INTO event_log_full (tstamp, sequence, point, remarks, labels, meta)
SELECT
tstamp, sequence, point, remarks, labels,
meta || json_build_object('geometry', geometry, 'readonly', virtual)::jsonb
FROM events;
-- We purposefully bypass event_log here
UPDATE event_log_full SET meta = meta - 'geometry' WHERE meta->>'geometry' IS NULL;
UPDATE event_log_full SET meta = meta - 'readonly' WHERE (meta->'readonly')::boolean IS false;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_database () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_database();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_database ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.0"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.0"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,99 @@
-- Drop old event tables.
--
-- New schema version: 0.3.1
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This completes the migration from the old event logging mechanism by
-- DROPPING THE OLD DATABASE OBJECTS, MAKING THE MIGRATION IRREVERSIBLE,
-- other than by restoring from backup and manually transferring any new
-- data that may have been created in the meanwhile.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect.
-- NOTE: This will lock the database while the transaction is active.
--
-- WARNING: Applying this upgrade drops the old tables. Ensure that you
-- have migrated the data first.
--
-- NOTE: This is a patch version change so it does not require a
-- backend restart.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
DROP FUNCTION IF EXISTS
label_in_sequence(integer,text), reset_events_serials();
DROP VIEW IF EXISTS
events_midnight_shot, events_seq_timed, events_labels, "events";
DROP TABLE IF EXISTS
events_seq_labels, events_timed_labels, events_timed_seq, events_seq, events_timed;
DROP SEQUENCE IF EXISTS
events_seq_id_seq, events_timed_id_seq;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_database () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_database();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_database ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.1"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.1"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,136 @@
-- Fix project_summary view.
--
-- New schema version: 0.3.2
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This fixes a problem with the project_summary view. In its common table
-- expression, the view definition tried to search public.projects based on
-- the search path value with the following expression:
--
-- (current_setting('search_path'::text) ~~ (p.schema || '%'::text))
--
-- That is of course bound to fail as soon as the schema goes above `survey_9`
-- because `survey_10 LIKE ('survey_1' || '%')` is TRUE.
--
-- The new mechanism relies on splitting the search_path.
--
-- NOTE: The survey schema needs to be the leftmost element in search_path.
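--
-- For example:
--
--   SELECT 'survey_10' LIKE ('survey_1' || '%');    -- true (the old, buggy test)
--   SELECT split_part('survey_10, public', ',', 1); -- 'survey_10' (the new test)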
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE VIEW project_summary AS
WITH fls AS (
SELECT avg((final_lines_summary.duration / ((final_lines_summary.num_points - 1))::double precision)) AS shooting_rate,
avg((final_lines_summary.length / date_part('epoch'::text, final_lines_summary.duration))) AS speed,
sum(final_lines_summary.duration) AS prod_duration,
sum(final_lines_summary.length) AS prod_distance
FROM final_lines_summary
), project AS (
SELECT p.pid,
p.name,
p.schema
FROM public.projects p
WHERE (split_part(current_setting('search_path'::text), ','::text, 1) = p.schema)
)
SELECT project.pid,
project.name,
project.schema,
( SELECT count(*) AS count
FROM preplot_lines
WHERE (preplot_lines.class = 'V'::bpchar)) AS lines,
ps.total,
ps.virgin,
ps.prime,
ps.other,
ps.ntba,
ps.remaining,
( SELECT to_json(fs.*) AS to_json
FROM final_shots fs
ORDER BY fs.tstamp
LIMIT 1) AS fsp,
( SELECT to_json(fs.*) AS to_json
FROM final_shots fs
ORDER BY fs.tstamp DESC
LIMIT 1) AS lsp,
( SELECT count(*) AS count
FROM raw_lines rl) AS seq_raw,
( SELECT count(*) AS count
FROM final_lines rl) AS seq_final,
fls.prod_duration,
fls.prod_distance,
fls.speed AS shooting_rate
FROM preplot_summary ps,
fls,
project;
ALTER TABLE project_summary OWNER TO postgres;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_15 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_15();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_15 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.2"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.2"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,169 @@
-- Fix not being able to edit a time-based event.
--
-- New schema version: 0.3.3
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- The event_log_update() function that gets called when trying to update
-- the event_log view will not work if the caller does not provide a
-- timestamp or sequence + point in the list of fields to be updated. See:
-- https://gitlab.com/wgp/dougal/software/-/issues/198
--
-- This fixes the problem by liberally using COALESCE() to merge the OLD
-- and NEW records.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION event_log_update() RETURNS trigger
LANGUAGE plpgsql
AS $inner$
BEGIN
IF (TG_OP = 'INSERT') THEN
-- Complete the tstamp if possible
IF NEW.sequence IS NOT NULL AND NEW.point IS NOT NULL AND NEW.tstamp IS NULL THEN
SELECT COALESCE(
tstamp_from_sequence_shot(NEW.sequence, NEW.point),
tstamp_interpolate(NEW.sequence, NEW.point)
)
INTO NEW.tstamp;
END IF;
-- Any id that is provided will be ignored. The generated
-- id will match uid.
INSERT INTO event_log_full
(tstamp, sequence, point, remarks, labels, meta)
VALUES (NEW.tstamp, NEW.sequence, NEW.point, NEW.remarks, NEW.labels, NEW.meta);
RETURN NEW;
ELSIF (TG_OP = 'UPDATE') THEN
-- Set end of validity and create a new entry with id
-- matching that of the old entry.
-- NOTE: Do not allow updating an event that has meta.readonly = true
IF EXISTS
(SELECT *
FROM event_log_full
WHERE id = OLD.id AND (meta->>'readonly')::boolean IS TRUE)
THEN
RAISE check_violation USING MESSAGE = 'Cannot modify read-only entry';
RETURN NULL;
END IF;
-- If the sequence / point has changed, and no new tstamp is provided, get one
IF (NEW.sequence <> OLD.sequence OR NEW.point <> OLD.point)
AND NEW.sequence IS NOT NULL AND NEW.point IS NOT NULL
AND (NEW.tstamp IS NULL OR NEW.tstamp = OLD.tstamp) THEN
SELECT COALESCE(
tstamp_from_sequence_shot(NEW.sequence, NEW.point),
tstamp_interpolate(NEW.sequence, NEW.point)
)
INTO NEW.tstamp;
END IF;
UPDATE event_log_full
SET validity = tstzrange(lower(validity), current_timestamp)
WHERE validity @> current_timestamp AND id = OLD.id;
-- Any attempt to modify id will be ignored.
INSERT INTO event_log_full
(id, tstamp, sequence, point, remarks, labels, meta)
VALUES (
OLD.id,
COALESCE(NEW.tstamp, OLD.tstamp),
COALESCE(NEW.sequence, OLD.sequence),
COALESCE(NEW.point, OLD.point),
COALESCE(NEW.remarks, OLD.remarks),
COALESCE(NEW.labels, OLD.labels),
COALESCE(NEW.meta, OLD.meta)
);
RETURN NEW;
ELSIF (TG_OP = 'DELETE') THEN
-- Set end of validity.
-- NOTE: We *do* allow deleting an event that has meta.readonly = true
-- This could be of interest if for instance we wanted to keep the history
-- of QC results for a point, provided that the QC routines write to
-- event_log and not event_log_full
UPDATE event_log_full
SET validity = tstzrange(lower(validity), current_timestamp)
WHERE validity @> current_timestamp AND id = OLD.id;
RETURN NULL;
END IF;
END;
$inner$;
CREATE OR REPLACE TRIGGER event_log_tg INSTEAD OF INSERT OR DELETE OR UPDATE ON event_log FOR EACH ROW EXECUTE FUNCTION event_log_update();
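-- With the COALESCE() calls in place, fields set to NULL in an update
-- fall back to their previous values; e.g. (hypothetical id):
--
--   UPDATE event_log SET remarks = 'Updated remark', tstamp = NULL
--   WHERE id = 42;
--
-- keeps the event's existing timestamp.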
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_16 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_16();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_16 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.3"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.3"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,163 @@
-- Add augment_event_data() to populate missing event timestamps and geometries.
--
-- New schema version: 0.3.4
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This creates a new procedure augment_event_data() which tries to
-- populate missing event_log data, namely timestamps and geometries.
--
-- To do this it also adds a function public.geometry_from_tstamp()
-- which, given a timestamp, tries to fetch a geometry from real_time_inputs.
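--
-- Once installed, the procedure can be run manually or from a
-- scheduled task with:
--
--   CALL augment_event_data();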
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE PROCEDURE augment_event_data ()
LANGUAGE sql
AS $inner$
-- Populate the timestamp of sequence / point events
UPDATE event_log_full
SET tstamp = tstamp_from_sequence_shot(sequence, point)
WHERE
tstamp IS NULL AND sequence IS NOT NULL AND point IS NOT NULL;
-- Populate the geometry of sequence / point events for which
-- there is raw_shots data.
UPDATE event_log_full
SET meta = meta ||
jsonb_build_object(
'geometry',
(
SELECT st_transform(geometry, 4326)::jsonb
FROM raw_shots rs
WHERE rs.sequence = event_log_full.sequence AND rs.point = event_log_full.point
)
)
WHERE
sequence IS NOT NULL AND point IS NOT NULL AND
NOT meta ? 'geometry';
-- Populate the geometry of time-based events
UPDATE event_log_full e
SET
meta = meta || jsonb_build_object('geometry',
(SELECT st_transform(g.geometry, 4326)::jsonb
FROM geometry_from_tstamp(e.tstamp, 3) g))
WHERE
tstamp IS NOT NULL AND
sequence IS NULL AND point IS NULL AND
NOT meta ? 'geometry';
-- Get rid of null geometries
UPDATE event_log_full
SET
meta = meta - 'geometry'
WHERE
jsonb_typeof(meta->'geometry') = 'null';
-- Simplify the GeoJSON when the CRS is EPSG:4326
UPDATE event_log_full
SET
meta = meta #- '{geometry, crs}'
WHERE
meta->'geometry'->'crs'->'properties'->>'name' = 'EPSG:4326';
$inner$;
COMMENT ON PROCEDURE augment_event_data()
IS 'Populate missing timestamps and geometries in event_log_full';
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_17 () AS $$
DECLARE
row RECORD;
BEGIN
CALL show_notice('Adding index to real_time_inputs.meta->tstamp');
CREATE INDEX IF NOT EXISTS meta_tstamp_idx
ON public.real_time_inputs
USING btree ((meta->>'tstamp') DESC);
CALL show_notice('Creating function geometry_from_tstamp');
CREATE OR REPLACE FUNCTION public.geometry_from_tstamp(
IN ts timestamptz,
IN tolerance numeric,
OUT "geometry" geometry,
OUT "delta" numeric)
AS $inner$
SELECT
geometry,
extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ) AS delta
FROM real_time_inputs
WHERE
geometry IS NOT NULL AND
abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts )) < tolerance
ORDER BY abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ))
LIMIT 1;
$inner$ LANGUAGE SQL;
COMMENT ON FUNCTION public.geometry_from_tstamp(timestamptz, numeric)
IS 'Get geometry from timestamp';
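-- Example usage (hypothetical values): fetch the geometry recorded
-- closest to the given timestamp, within 3 seconds of it:
--   SELECT * FROM public.geometry_from_tstamp('2022-04-01 12:00:00+00', 3);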
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_17();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_17 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.4"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.4"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -7,14 +7,20 @@
id: missing_shots
check: |
const sequence = currentItem;
const sp0 = Math.min(sequence.fsp, sequence.lsp);
const sp1 = Math.max(sequence.fsp, sequence.lsp);
const missing = preplots.filter(r => r.line == sequence.line &&
r.point >= sp0 && r.point <= sp1 &&
!sequence.shots.find(s => s.point == r.point)
);
let results;
if (sequence.missing_shots) {
results = {
shots: {}
}
const missing_shots = missingShotpoints.filter(i => !i.ntba);
for (const shot of missing_shots) {
results.shots[shot.point] = { remarks: "Missed shot", labels: [ "QC", "QCAcq" ] };
}
} else {
results = true;
}
missing.length == 0 || missing.map(r => `Missing shot: ${r.point}`).join("\n")
results;
-
name: "Gun QC"
disabled: false
@@ -25,15 +31,15 @@
iterate: "sequences"
id: seq_no_gun_data
check: |
const sequence = currentItem;
currentItem.has_smsrc_data || "Sequence has no gun data"
shotpoints.some(i => i.meta?.raw?.smsrc) || "Sequence has no gun data"
-
name: "Missing gun data"
id: missing_gun_data
ignoreAllFailed: true
check: |
sequences.some(s => s.sequence == currentItem.sequence && s.has_smsrc_data)
? (!!currentItem._("raw_meta.smsrc.guns") || "Missing gun data")
: true
!!currentItem._("raw_meta.smsrc.guns")
? true
: "Missing gun data"
-
name: "No fire"
@@ -41,8 +47,8 @@
check: |
const currentShot = currentItem;
const gunData = currentItem._("raw_meta.smsrc");
(gunData && gunData.num_nofire != 0)
? `Source ${gunData.src_number}: No fire (${gunData.num_nofire} guns)`
(gunData && gunData.guns && gunData.guns.length != gunData.num_active)
? `Source ${gunData.src_number}: No fire (${gunData.guns.length - gunData.num_active} guns)`
: true;
-
@@ -56,8 +62,8 @@
.guns
.filter(gun => ((gun[2] == gunData.src_number) && (gun[pressure]/parameters.gunPressureNominal - 1) > parameters.gunPressureToleranceRatio))
.map(gun =>
`source ${gun[2]}, string ${gun[0]}, gun ${gun[1]}, pressure: ${gun[pressure]} / ${parameters.gunPressureNominal} = ${(Math.abs(gunData.manifold/parameters.gunPressureNominal - 1)*100).toFixed(1)}% > ${(parameters.gunPressureToleranceRatio*100).toFixed(1)}%`
);
`source ${gun[2]}, string ${gun[0]}, gun ${gun[1]}, pressure: ${gun[pressure]} / ${parameters.gunPressureNominal} = ${(Math.abs(gun[pressure]/parameters.gunPressureNominal - 1)*100).toFixed(2)}% > ${(parameters.gunPressureToleranceRatio*100).toFixed(2)}%`
).join(" \n");
results && results.length
? results
: true
@@ -201,7 +207,7 @@
check: |
const currentShot = currentItem;
Math.abs(currentShot.error_i) <= parameters.crosslineError
|| `Crossline error (${currentShot.type}): ${currentShot.error_i.toFixed(1)} > ${parameters.crosslineError}`
|| `Crossline error (${currentShot.type}): ${currentShot.error_i.toFixed(2)} > ${parameters.crosslineError}`
-
name: "Inline"
@@ -209,7 +215,7 @@
check: |
const currentShot = currentItem;
Math.abs(currentShot.error_j) <= parameters.inlineError
|| `Inline error (${currentShot.type}): ${currentShot.error_j.toFixed(1)} > ${parameters.inlineError}`
|| `Inline error (${currentShot.type}): ${currentShot.error_j.toFixed(2)} > ${parameters.inlineError}`
-
name: "Centre of source preplot deviation (moving average)"
@@ -222,11 +228,16 @@
id: crossline_average
check: |
const currentSequence = currentItem;
const i_err = currentSequence.shots.filter(s => s.error_i != null).map(a => a.error_i);
//const i_err = shotpoints.filter(s => s.error_i != null).map(a => a.error_i);
const i_err = shotpoints.map(i =>
(i.errorfinal?.coordinates ?? i.errorraw?.coordinates)[0]
)
.filter(i => !isNaN(i));
if (i_err.length) {
const avg = i_err.reduce( (a, b) => a+b)/i_err.length;
avg <= parameters.crosslineErrorAverage ||
`Average crossline error: ${avg.toFixed(1)} > ${parameters.crosslineErrorAverage}`
`Average crossline error: ${avg.toFixed(2)} > ${parameters.crosslineErrorAverage}`
} else {
`Sequence ${currentSequence.sequence} has no shots within preplot`
}
@@ -239,16 +250,27 @@
check: |
const currentSequence = currentItem;
const n = parameters.inlineErrorRunningAverageShots; // For brevity
const results = currentSequence.shots.slice(n/2, -n/2).map( (shot, index) => {
const shots = currentSequence.shots.slice(index, index+n).map(i => i.error_j).filter(i => i !== null);
const results = shotpoints.slice(n/2, -n/2).map( (shot, index) => {
const shots = shotpoints.slice(index, index+n).map(i =>
(i.errorfinal?.coordinates ?? i.errorraw?.coordinates)[1]
).filter(i => i !== null);
if (!shots.length) {
// We are outside the preplot
// Nothing to see here, move along
return true;
}
const mean = shots.reduce( (a, b) => a+b ) / shots.length;
return Math.abs(mean) <= parameters.inlineErrorRunningAverageValue ||
`Running average inline error: shot ${shot.point}, ${mean.toFixed(1)} > ${parameters.inlineErrorRunningAverageValue}`
return Math.abs(mean) <= parameters.inlineErrorRunningAverageValue || [
shot.point,
{
remarks: `Running average inline error: ${mean.toFixed(2)} > ${parameters.inlineErrorRunningAverageValue}`,
labels: [ "QC", "QCNav" ]
}
]
}).filter(i => i !== true);
results.length == 0 || results.join("\n");
results.length == 0 || {
remarks: "Sequence exceeds inline error running average limit",
shots: Object.fromEntries(results)
}

File diff suppressed because it is too large.


@@ -31,7 +31,7 @@
"@vue/cli-plugin-router": "~4.4.0",
"@vue/cli-plugin-vuex": "~4.4.0",
"@vue/cli-service": "^4.5.13",
"sass": "^1.26.11",
"sass": "~1.32",
"sass-loader": "^8.0.0",
"stylus": "^0.54.8",
"stylus-loader": "^3.0.2",


@@ -26,7 +26,7 @@
<style lang="stylus">
@import '../node_modules/typeface-roboto/index.css'
@import '../node_modules/@mdi/font/css/materialdesignicons.css'
.markdown.v-textarea textarea
font-family monospace
line-height 1.1 !important
@@ -66,7 +66,7 @@ export default {
snackText (newVal) {
this.snack = !!newVal;
},
snack (newVal) {
// When the snack is hidden (one way or another), clear
// the text so that if we receive the same message again


@@ -1,6 +1,7 @@
<template>
<v-menu
v-model="show"
:value="value"
@input="(e) => $emit('input', e)"
:position-x="absolute && x || undefined"
:position-y="absolute && y || undefined"
:absolute="absolute"
@@ -20,6 +21,7 @@
<dougal-context-menu v-if="item.items"
:value="showSubmenu"
:items="item.items"
:labels="labels.concat(item.labels||[])"
@input="selected"
submenu>
<template v-slot:activator="{ on, attrs }">
@@ -55,14 +57,14 @@ export default {
props: {
value: { type: [ MouseEvent, Object, Boolean ] },
labels: { type: [ Array ], default: () => [] },
absolute: { type: Boolean, default: false },
submenu: { type: Boolean, default: false },
items: { type: Array, default: [] }
items: { type: Array, default: () => [] }
},
data () {
return {
show: false,
x: 0,
y: 0,
showSubmenu: false
@@ -97,7 +99,12 @@ export default {
selected (item) {
this.show = false;
this.$emit('input', item);
if (typeof item === 'object' && item !== null) {
const labels = this.labels.concat(item.labels??[]);
this.$emit('input', {...item, labels});
} else {
this.$emit('input', item);
}
}
}


@@ -1,406 +0,0 @@
<template>
<v-dialog
v-model="show"
max-width="600px"
>
<template v-slot:activator="{ on, attrs }">
<v-btn
class="mx-2"
fab dark
x-small
color="primary"
title="Add event"
v-bind="attrs"
v-on="on"
>
<v-icon dark>mdi-plus</v-icon>
</v-btn>
</template>
<v-card>
<v-card-title>
<span class="headline">{{ formTitle }}</span>
</v-card-title>
<v-card-text>
<v-container>
<v-row>
<v-col>
<v-textarea
v-model="remarks"
label="Description"
rows="1"
auto-grow
clearable
autofocus
filled
:hint="presetRemarks ? 'Enter your own comment or select a preset one from the menu on the left' : 'Enter a comment'"
@keyup="handleKeys"
>
<template v-slot:prepend v-if="presetRemarks">
<v-icon
title="Select predefined comments"
color="primary"
@click="showRemarksMenu"
>
mdi-dots-vertical
</v-icon>
</template>
<template v-slot:prepend v-else>
<v-icon
color="disabled"
>
mdi-dots-vertical
</v-icon>
</template>
</v-textarea>
<dougal-context-menu
:value="remarksMenu"
@input="addRemark"
:items="presetRemarks"
absolute
></dougal-context-menu>
</v-col>
</v-row>
<v-row dense>
<v-col>
<v-autocomplete
ref="labels"
v-model="labels"
:items="Object.keys(allowedLabels)"
chips
deletable-chips
multiple
label="Labels"
@input="labelSearch=null; $refs.labels.isMenuActive=false"
:search-input.sync="labelSearch"
>
<template v-slot:selection="data">
<v-chip
v-bind="data.attrs"
:input-value="data.selected"
close
@click="data.select"
@click:close="remove(data.item)"
:color="allowedLabels[data.item].view.colour"
:title="allowedLabels[data.item].view.description"
>{{data.item}}</v-chip>
</template>
<template v-slot:prepend v-if="presetLabels">
<v-icon
title="Select labels"
color="primary"
@click="showLabelsMenu"
>
mdi-dots-vertical
</v-icon>
</template>
<template v-slot:prepend v-else>
<v-icon
color="disabled"
>
mdi-dots-vertical
</v-icon>
</template>
</v-autocomplete>
</v-col>
</v-row>
<v-row dense>
<v-col>
<v-switch label="Change time" v-model="timeInput" :disabled="shotInput"></v-switch>
</v-col>
<v-col>
<v-switch label="Enter shotpoint" v-model="shotInput" :disabled="timeInput"></v-switch>
</v-col>
</v-row>
<v-row dense>
<v-col :style="{visibility: timeInput ? 'visible' : 'hidden'}">
<v-text-field v-model="tsTime" type="time" step="1" label="Time">
</v-text-field>
</v-col>
<v-col :style="{visibility: timeInput ? 'visible' : 'hidden'}">
<v-text-field v-model="tsDate" type="date" label="Date">
</v-text-field>
</v-col>
<v-col :style="{visibility: shotInput ? 'visible' : 'hidden'}">
<v-autocomplete
:items="sequenceList"
v-model="sequence"
label="Sequence"
></v-autocomplete>
</v-col>
<v-col :style="{visibility: shotInput ? 'visible' : 'hidden'}">
<v-text-field v-model="point" type="number" label="Shot">
</v-text-field>
</v-col>
</v-row>
</v-container>
</v-card-text>
<v-card-actions>
<v-spacer></v-spacer>
<v-btn color="blue darken-1" text @click="close">Cancel</v-btn>
<v-btn color="blue darken-1" text @click="save" :disabled="!isValid">Save</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
</template>
<style>
</style>
<script>
import { mapActions } from 'vuex';
import DougalContextMenu from '@/components/context-menu';
import { withParentProps } from '@/lib/utils'
export default {
name: 'DougalEventEditDialog',
components: {
DougalContextMenu
},
props: {
value: Boolean,
allowedLabels: { type: Object, default: () => {} },
sequences: { type: Object, default: null },
defaultTimestamp: { type: [ Date, String, Number, Function ], default: null },
defaultSequence: { type: Number, default: null },
defaultShotpoint: { type: Number, default: null },
eventMode: { type: String, default: "timed" },
presetRemarks: { type: [ Object, Array ], default: null },
presetLabels: { type: [ Object, Array ], default: null }
},
data () {
const tsNow = new Date;
return {
show: false,
tsDate: tsNow.toISOString().substring(0, 10),
tsTime: tsNow.toISOString().substring(11, 19),
sequenceData: null,
sequence: null,
point: null,
remarks: "",
labels: [],
labelSearch: null,
timer: null,
timeInput: false,
shotInput: false,
remarksMenu: false,
menuX: 0,
menuY: 0,
}
},
computed: {
eventType () {
return this.timeInput
? "timed"
: this.shotInput
? "seq"
: this.eventMode;
},
formTitle () {
if (this.eventType == "seq") {
return `New event at shotpoint ${this.shot.point}`;
} else {
return "New event at time "+this.tstamp.toISOString().replace(/(.{10})T(.{8}).{4}Z$/, "$1 $2");
}
},
defaultTimestampAsDate () {
if (this.defaultTimestamp instanceof Date) {
return this.defaultTimestamp;
} else if (typeof this.defaultTimestamp == 'string') {
return new Date(this.defaultTimestamp);
} else if (typeof this.defaultTimestamp == 'number') {
return new Date(this.defaultTimestamp);
} else if (typeof this.defaultTimestamp == 'function') {
return new Date(this.defaultTimestamp());
}
},
tstamp () {
return this.timeInput
? new Date(this.tsDate+"T"+this.tsTime+"Z")
: this.defaultTimestampAsDate || new Date();
},
shot () {
return this.shotInput
? { sequence: this.sequence, point: Number(this.point) }
: { sequence: this.defaultSequence, point: this.defaultShotpoint };
},
isTimedEvent () {
return Boolean((this.timeInput && this.tstamp) ||
(this.defaultTimestampAsDate && !this.shotInput));
},
isShotEvent () {
return Boolean((this.shotInput && this.shot.sequence && this.shot.point) ||
(this.defaultSequence && this.defaultShotpoint && !this.timeInput));
},
isValid () {
if (this.isTimedEvent) {
return !isNaN(this.tstamp) &&
((this.remarks && this.remarks.trim()) || this.labels.length);
}
if (this.isShotEvent) {
return Number(this.sequence) && Number(this.point) &&
((this.remarks && this.remarks.trim()) || this.labels.length);
}
return false;
},
sequenceList () {
const seq = this.sequences || this.sequenceData || [];
return seq.map(s => s.sequence).sort((a,b) => b-a);
},
eventData () {
if (!this.isValid) {
return null;
}
const data = {}
data.remarks = this.remarks.trim();
if (this.labels) {
data.labels = this.labels;
}
if (this.isTimedEvent) {
data.tstamp = this.tstamp;
} else if (this.isShotEvent) {
data.sequence = this.shot.sequence;
data.point = this.shot.point;
}
return data;
}
},
watch: {
async show (value) {
this.$emit('input', value);
if (value) {
this.updateTimeFields();
await this.updateSequences();
this.sequence = this.defaultSequence;
this.point = this.defaultShotpoint;
this.shotInput = this.eventMode == "seq";
}
},
value (v) {
if (v != this.show) {
this.show = v;
}
}
},
methods: {
clear () {
this.timeInput = false;
this.shotInput = false;
this.remarks = "";
this.labels = [];
},
close () {
this.show = false;
this.clear();
},
save () {
this.$emit('save', this.eventData);
this.close();
},
remove (item) {
this.labels.splice(this.labels.indexOf(item), 1);
},
updateTimeFields () {
const tsNow = new Date;
this.tsDate = tsNow.toISOString().substring(0, 10);
this.tsTime = tsNow.toISOString().substring(11, 19);
},
async updateSequences () {
if (this.sequences == null) {
const url = `/project/${this.$route.params.project}/sequence`;
this.sequenceData = await this.api([url]) || null
}
this.sequence = this.sequenceList.reduce( (a, b) => Math.max(a, b) );
},
showRemarksMenu (e) {
this.remarksMenu = e;
},
addRemark (item) {
const p = withParentProps(item, this.presetRemarks, "items", "labels");
item = p[1]
? Object.assign({labels: p[1]}, item)
: item;
if (item.text) {
if (this.remarks === null) {
this.remarks = "";
}
if (this.remarks.length && this.remarks[this.remarks.length-1] != "\n") {
this.remarks += "\n";
}
this.remarks += item.text;
}
if (item.labels) {
const unique = new Set();
this.labels.concat(item.labels).forEach(l => unique.add(l));
this.labels = [...unique];
}
},
handleKeys (e) {
if (e.ctrlKey && !e.altKey && !e.shiftKey && !e.metaKey && e.keyCode == 13) {
// Ctrl+Enter
if (this.isValid) {
this.save();
}
}
},
...mapActions(["api"])
}
};
</script>


@@ -0,0 +1,240 @@
<template>
<v-dialog
v-model="dialog"
style="z-index:2020;"
>
<template v-slot:activator="{ on, attrs }">
<v-btn
class="hover"
icon
small
title="This entry has edits. Click to view history."
:disabled="disabled"
v-on="on"
>
<v-icon small>mdi-playlist-edit</v-icon>
</v-btn>
</template>
<v-card>
<v-card-title class="headline">
Event history
</v-card-title>
<v-card-text>
<p>Event ID: {{ id }}</p>
<v-data-table
dense
class="small"
:headers="headers"
:items="rows"
item-key="uid"
sort-by="uid"
:sort-desc="true"
:loading="loading"
fixed-header
:footer-props='{itemsPerPageOptions: [ 10, 25, 50, 100, 500, -1 ]}'
>
<template v-slot:item.tstamp="{value}">
<span style="white-space:nowrap;" v-if="value">
{{ value.replace(/(.{10})T(.{8}).{4}Z$/, "$1 $2") }}
</span>
</template>
<template v-slot:item.remarks="{item}">
<template>
<div>
<span v-if="item.labels.length">
<v-chip v-for="label in item.labels"
class="mr-1 px-2 underline-on-hover"
x-small
:color="labels[label] && labels[label].view.colour"
:title="labels[label] && labels[label].view.description"
:key="label"
:href="$route.path+'?label='+encodeURIComponent(label)"
>{{label}}</v-chip>
</span>
<span v-html="$options.filters.markdownInline(item.remarks)">
</span>
</div>
</template>
</template>
<template v-slot:item.valid_from="{item}">
<span style="white-space:nowrap;" v-if="item.validity[1]">
{{ item.validity[1].replace(/(.{10})[T ](.{8}).{4,}(Z|[+-][\d]+)$/, "$1 $2") }}
</span>
<span v-else>
</span>
</template>
<template v-slot:item.valid_until="{item}">
<span style="white-space:nowrap;" v-if="item.validity[2]">
{{ item.validity[2].replace(/(.{10})[T ](.{8}).{4,}(Z|[+-][\d]+)$/, "$1 $2") }}
</span>
<span v-else>
</span>
</template>
<!-- Actions column -->
<template v-slot:item.actions="{ item }">
<div style="white-space:nowrap;">
<!-- NOTE Kind of cheating here by assuming that there will be
no items with *future* validity. -->
<template v-if="item.validity[2]">
<v-btn v-if="!item.meta.readonly"
class="hover"
icon
small
title="Restore"
:disabled="loading"
@click=restoreEvent(item)
>
<v-icon small>mdi-history</v-icon>
</v-btn>
<v-btn v-else
class="hover off"
icon
small
title="This event is read-only"
:disabled="loading"
>
<v-icon small>mdi-lock-reset</v-icon>
</v-btn>
</template>
</div>
</template>
</v-data-table>
</v-card-text>
</v-card>
</v-dialog>
</template>
<style scoped>
.hover {
opacity: 0.4;
}
.hover:hover {
opacity: 1;
}
.hover.off:hover {
opacity: 0.4;
}
.small >>> td, .small >>> th {
font-size: 85% !important;
}
</style>
<script>
import { mapActions, mapGetters } from 'vuex';
export default {
name: 'DougalEventEditHistory',
props: {
id: { type: Number },
disabled: { type: Boolean, default: false },
labels: { default: () => ({}) }
},
data () {
return {
dialog: false,
rows: [],
headers: [
{
value: "tstamp",
text: "Timestamp",
width: "20ex"
},
{
value: "sequence",
text: "Sequence",
align: "end",
width: "10ex"
},
{
value: "point",
text: "Shotpoint",
align: "end",
width: "10ex"
},
{
value: "remarks",
text: "Text",
width: "100%"
},
{
value: "valid_from",
text: "Valid From"
},
{
value: "valid_until",
text: "Valid Until"
},
{
value: "actions",
text: "Actions",
sortable: false
}
]
};
},
computed: {
...mapGetters(['loading', 'serverEvent'])
},
watch: {
dialog (val) {
if (!val) {
this.rows = [];
} else {
this.getEventHistory();
}
},
async serverEvent (event) {
if (event.channel == "event" &&
(event.payload?.new?.id ?? event.payload?.old?.id) == this.id) {
// The event that we're viewing has been refreshed (possibly by us)
this.getEventHistory();
}
}
},
methods: {
async getEventHistory () {
const url = `/project/${this.$route.params.project}/event/${this.id}`;
this.rows = (await this.api([url]) || []).map(row => {
row.valid_from = row.validity[1] ?? -Infinity;
row.valid_until = row.validity[2] ?? +Infinity;
return row;
});
},
async restoreEvent (item) {
if (item.id) {
const url = `/project/${this.$route.params.project}/event/${item.id}`;
await this.api([url, {
method: "PUT",
body: item // NOTE Sending extra attributes in the body may cause trouble down the line
}]);
}
},
...mapActions(["api"])
}
};
</script>


@@ -0,0 +1,208 @@
<template>
<v-dialog
:value="value"
@input="(e) => $emit('input', e)"
max-width="600"
>
<v-card>
<v-toolbar
flat
color="transparent"
>
<v-toolbar-title>Event labels</v-toolbar-title>
<v-spacer></v-spacer>
<v-btn
icon
@click="$refs.search.focus()"
>
<v-icon>mdi-magnify</v-icon>
</v-btn>
</v-toolbar>
<v-container class="py-0">
<v-row
align="center"
justify="start"
>
<v-col
v-for="(item, i) in selection"
:key="item.text"
class="shrink"
>
<v-chip
:disabled="loading"
small
:color="item.colour"
:title="item.title"
close
@click:close="selection.splice(i, 1)"
>
<v-icon
left
v-text="item.icon"
></v-icon>
{{ item.text }}
</v-chip>
</v-col>
<v-col v-if="!allSelected"
cols="12"
>
<v-text-field
ref="search"
v-model="search"
full-width
hide-details
label="Search"
single-line
></v-text-field>
</v-col>
</v-row>
</v-container>
<v-divider v-if="!allSelected"></v-divider>
<v-list dense style="max-height:600px;overflow-y:auto;">
<template v-for="item in categories">
<v-list-item v-if="!selection.find(i => i.text == item.text)"
dense
:key="item.text"
:disabled="loading"
@click="selection.push(item)"
>
<v-list-item-avatar
class="my-0"
width="12ex"
>
<v-chip
x-small
:color="item.colour"
:title="item.title"
>{{item.text}}</v-chip>
</v-list-item-avatar>
<v-list-item-title v-text="item.title"></v-list-item-title>
</v-list-item>
</template>
</v-list>
<v-divider></v-divider>
<v-card-actions>
<v-btn
:loading="loading"
color="warning"
text
@click="close"
>
Cancel
</v-btn>
<v-spacer></v-spacer>
<v-btn
:disabled="!dirty"
:loading="loading"
color="primary"
text
@click="save"
>
Save
</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
</template>
<script>
function stringSort (a, b) {
return a == b
? 0
: a < b
? -1
: +1;
}
export default {
name: 'DougalEventEditLabels',
props: {
value: { default: false },
labels: { type: Object },
selected: { type: Array },
loading: { type: Boolean, default: false }
},
data: () => ({
dialog: false,
search: '',
selection: [],
}),
computed: {
allSelected () {
return this.selection.length === this.items.length
},
dirty () {
// Checks if the arrays have the same elements
return !this.selection.every(i => this.selected.includes(i.text)) ||
!this.selected.every(i => this.selection.find(j => j.text==i));
},
categories () {
const search = this.search.toLowerCase()
if (!search) return this.items
return this.items.filter(item => {
const text = item.text.toLowerCase();
const title = item.title.toLowerCase();
return text.includes(search) || title.includes(search);
}).sort( (a, b) => stringSort(a.text, b.text) )
},
items () {
return Object.keys(this.labels).map(this.labelToItem);
}
},
watch: {
value () {
this.dialog = this.value;
if (this.dialog) {
this.$nextTick(() => this.$refs.search?.focus());
}
},
selected () {
this.selection = this.selected.map(this.labelToItem)
},
selection () {
this.search = '';
this.$refs.search?.focus();
},
},
methods: {
labelToItem (k) {
return {
text: k,
icon: this.labels[k].view?.icon,
colour: this.labels[k].view?.colour,
title: this.labels[k].view?.description
};
},
close () {
this.selection = this.selected.map(this.labelToItem)
this.$emit("input", false);
},
save () {
this.$emit("selectionChanged", {labels: this.selection.map(i => i.text)});
this.$emit("input", false);
},
},
}
</script>

View File

@@ -0,0 +1,679 @@
<template>
<v-dialog
:value="value"
@input="(e) => $emit('input', e)"
max-width="600"
>
<template v-slot:activator="{ on, attrs }">
<v-btn
class="mx-2"
fab dark
x-small
color="primary"
title="Add event"
@click="(e) => $emit('new', e)"
v-bind="attrs"
v-on="on"
>
<v-icon dark>mdi-plus</v-icon>
</v-btn>
</template>
<v-card>
<v-toolbar
flat
color="transparent"
>
<v-toolbar-title>Event</v-toolbar-title>
<v-spacer></v-spacer>
</v-toolbar>
<v-container class="py-0">
<v-row dense>
<v-col>
<v-menu
v-model="dateMenu"
:close-on-content-click="false"
:nudge-right="40"
transition="scale-transition"
offset-y
min-width="auto"
>
<template v-slot:activator="{ on, attrs }">
<v-text-field
v-model="tsDate"
:disabled="!!(sequence || point || entrySequence || entryPoint)"
label="Date"
suffix="UTC"
prepend-icon="mdi-calendar"
readonly
v-bind="attrs"
v-on="on"
@change="updateAncillaryData"
></v-text-field>
</template>
<v-date-picker
v-model="tsDate"
@input="dateMenu = false"
></v-date-picker>
</v-menu>
</v-col>
<v-col>
<v-text-field
v-model="tsTime"
:disabled="!!(sequence || point || entrySequence || entryPoint)"
label="Time"
suffix="UTC"
prepend-icon="mdi-clock-outline"
type="time"
step="1"
@change="updateAncillaryData"
>
<template v-slot:prepend>
<v-menu
v-model="timeMenu"
:close-on-content-click="false"
:nudge-right="40"
transition="scale-transition"
offset-y
min-width="auto"
>
<template v-slot:activator="{ on, attrs }">
<v-icon v-on="on" v-bind="attrs">mdi-clock-outline</v-icon>
</template>
<v-time-picker
v-model="tsTime"
format="24hr"
></v-time-picker>
</v-menu>
</template>
</v-text-field>
</v-col>
</v-row>
<v-row dense>
<v-col>
<v-text-field
v-model="entrySequence"
type="number"
min="1"
step="1"
label="Sequence"
prepend-icon="mdi-format-list-bulleted"
@change="updateAncillaryData"
>
</v-text-field>
</v-col>
<v-col>
<v-text-field
v-model="entryPoint"
type="number"
min="1"
step="1"
label="Point"
prepend-icon="mdi-map-marker-circle"
@change="updateAncillaryData"
>
</v-text-field>
</v-col>
</v-row>
<v-row dense>
<v-col cols="12">
<v-combobox
ref="remarks"
v-model="entryRemarks"
:disabled="loading"
:search-input.sync="entryRemarksInput"
:items="remarksAvailable"
:filter="searchRemarks"
item-text="text"
return-object
label="Remarks"
prepend-icon="mdi-text-box-outline"
append-outer-icon="mdi-magnify"
@click:append-outer="(e) => remarksMenu = e"
></v-combobox>
<dougal-context-menu
:value="remarksMenu"
@input="handleRemarksMenu"
:items="presetRemarks"
absolute
></dougal-context-menu>
</v-col>
</v-row>
<v-row dense>
<v-col cols="12">
<v-autocomplete
ref="labels"
v-model="entryLabels"
:items="categories"
multiple
menu-props="closeOnClick, closeOnContentClick"
attach
chips
label="Labels"
prepend-icon="mdi-tag-multiple"
append-outer-icon="mdi-magnify"
@click:append-outer="() => $refs.labels.focus()"
>
<template v-slot:selection="{ item, index, select, selected, disabled }">
<v-chip
:disabled="loading"
small
light
:color="item.colour"
:title="item.title"
close
@click:close="entryLabels.splice(index, 1)"
>
<v-icon
left
v-text="item.icon"
></v-icon>
{{ item.text }}
</v-chip>
</template>
<template v-slot:item="{ item }">
<v-list-item-avatar
class="my-0"
width="12ex"
>
<v-chip
x-small
light
:color="item.colour"
:title="item.title"
>{{item.text}}</v-chip>
</v-list-item-avatar>
<v-list-item-title v-text="item.title"></v-list-item-title>
</template>
</v-autocomplete>
</v-col>
</v-row>
<v-row dense>
<v-col>
<v-text-field
v-model="entryLatitude"
label="Latitude"
prepend-icon="φ"
disabled
>
<template v-slot:append-outer>
<v-icon v-if="false/*TODO*/"
title="Click to set position"
@click="1==1/*TODO*/"
>mdi-crosshairs-gps</v-icon>
<v-icon v-else
disabled
title="No GNSS available"
>mdi-crosshairs</v-icon>
</template>
</v-text-field>
</v-col>
<v-col>
<v-text-field
v-model="entryLongitude"
label="Longitude"
prepend-icon="λ"
disabled
>
<template v-slot:append-outer>
<v-icon v-if="false"
title="Click to set position"
@click="getPosition"
>mdi-crosshairs-gps</v-icon>
<v-icon v-else
title="No GNSS available"
disabled
>mdi-crosshairs</v-icon>
</template>
</v-text-field>
</v-col>
</v-row>
</v-container>
<v-divider></v-divider>
<v-card-actions>
<v-btn
color="warning"
text
@click="close"
>
Cancel
</v-btn>
<v-spacer></v-spacer>
<v-btn
:disabled="!canSave"
:loading="loading"
color="primary"
text
@click="save"
>
Save
</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
</template>
<style>
/* https://github.com/vuetifyjs/vuetify/issues/471 */
.v-dialog {
overflow-y: initial;
}
</style>
<script>
import { mapActions } from 'vuex';
import DougalContextMenu from '@/components/context-menu';
function stringSort (a, b) {
return a == b
? 0
: a < b
? -1
: +1;
}
function flattenRemarks(items, keywords=[], labels=[]) {
const result = [];
if (items) {
for (const item of items) {
if (!item.items) {
result.push({
text: item.text,
labels: labels.concat(item.labels??[]),
keywords
})
} else {
const k = [...keywords, item.text];
const l = [...labels, ...(item.labels??[])];
result.push(...flattenRemarks(item.items, k, l))
}
}
}
return result;
}
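// Worked example (hypothetical preset structure): given
//   [{ text: "Weather", items: [{ text: "High swell", labels: ["weather"] }] }]
// flattenRemarks() returns
//   [{ text: "High swell", labels: ["weather"], keywords: ["Weather"] }]
// i.e. group nodes contribute their text as search keywords, and their
// labels are inherited by every leaf beneath them.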
/** Compare two arrays
*
* @a a First array
* @a b Second array
* @a cbB Callback to transform elements of `b`
*
* @return true if the sets are distinct, false otherwise
*
* Note that this will not work with objects or other complex
* elements unless the array members are the same object (as
* opposed to merely identical).
*/
function distinctSets(a, b, cbB = (i) => i) {
return !a.every(i => b.map(cbB).includes(i)) ||
!b.map(cbB).every(i => a.find(j => j==i));
}
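// e.g. distinctSets(["a", "b"], ["b", "a"])            → false (same set)
//      distinctSets(["a"], ["a", "b"])                 → true
//      distinctSets(["a"], [{text: "a"}], i => i.text) → false
// Caveat: falsy members (0, "") defeat the `a.find(...)` truthiness test.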
export default {
name: 'DougalEventEdit',
components: {
DougalContextMenu
},
props: {
value: { default: false },
availableLabels: { type: Object, default: () => ({}) },
presetRemarks: { type: Array, default: () => [] },
id: { type: Number },
tstamp: { type: String },
sequence: { type: Number },
point: { type: Number },
remarks: { type: String },
labels: { type: Array, default: () => [] },
latitude: { type: Number },
longitude: { type: Number },
loading: { type: Boolean, default: false }
},
data: () => ({
dateMenu: false,
timeMenu: false,
remarksMenu: false,
search: '',
entryLabels: [],
tsDate: null,
tsTime: null,
entrySequence: null,
entryPoint: null,
entryRemarks: null,
entryRemarksInput: null,
entryLatitude: null,
entryLongitude: null
}),
computed: {
remarksAvailable () {
return this.entryRemarksInput == this.entryRemarks?.text ||
this.entryRemarksInput == this.entryRemarks
? []
: flattenRemarks(this.presetRemarks);
},
allSelected () {
return this.entryLabels.length === this.items.length
},
dirty () {
// Selected remark distinct from input remark
if (this.entryRemarksText != this.remarks) {
return true;
}
// The user is editing the remarks
if (this.entryRemarksText != this.entryRemarksInput) {
return true;
}
// Selected label set distinct from input labels
if (distinctSets(this.selectedLabels, this.entryLabels, (i) => i.text)) {
return true;
}
// Selected seqpoint distinct from input seqpoint (if seqpoint present)
if ((this.entrySequence || this.entryPoint)) {
if (this.entrySequence != this.sequence || this.entryPoint != this.point) {
return true;
}
} else {
// Selected timestamp distinct from input timestamp (if no seqpoint)
const epoch = Date.parse(this.tstamp);
const entryEpoch = Date.parse(`${this.tsDate} ${this.tsTime}Z`);
// Ignore difference of less than one second
if (Math.abs(entryEpoch - epoch) > 1000) {
return true;
}
}
return false;
},
canSave () {
// There is either a tstamp or a seqpoint; the latter wins
if (!(this.entrySequence && this.entryPoint) && !this.entryTstamp) {
return false;
}
// There are remarks and/or labels
if (!this.entryRemarksText && !this.entryLabels.length) {
return false;
}
// Form is dirty
if (!this.dirty) {
return false;
}
return true;
},
categories () {
const search = this.search.toLowerCase()
if (!search) return this.items
return this.items.filter(item => {
const text = item.text.toLowerCase();
const title = item.title.toLowerCase();
return text.includes(search) || title.includes(search);
}).sort( (a, b) => stringSort(a.text, b.text) )
},
items () {
return Object.keys(this.availableLabels).map(this.labelToItem);
},
selectedLabels () {
return this.labels ?? [];
},
entryTstamp () {
const ts = new Date(Date.parse(`${this.tsDate} ${this.tsTime}Z`));
if (isNaN(ts)) {
return null;
}
return ts.toISOString();
},
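// NOTE entryTstamp above feeds Date.parse a space-separated
// "YYYY-MM-DD HH:MM:SSZ" string; this is outside the ISO 8601 grammar,
// so parsing is engine-dependent (V8 accepts it, other engines may not).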
entryRemarksText () {
return typeof this.entryRemarks === 'string'
? this.entryRemarks
: this.entryRemarks?.text;
}
},
watch: {
value () {
if (this.value) {
// Populate fields from properties
if (!this.tstamp && !this.sequence && !this.point) {
const ts = (new Date()).toISOString();
this.tsDate = ts.substr(0, 10);
this.tsTime = ts.substr(11, 8);
} else if (this.tstamp) {
this.tsDate = this.tstamp.substr(0, 10);
this.tsTime = this.tstamp.substr(11, 8);
}
// NOTE Dead code
if (this.meta?.geometry?.type == "Point") {
this.entryLongitude = this.meta.geometry.coordinates[0];
this.entryLatitude = this.meta.geometry.coordinates[1];
}
this.entryLatitude = this.latitude;
this.entryLongitude = this.longitude;
this.entrySequence = this.sequence;
this.entryPoint = this.point;
this.entryRemarks = this.remarks;
this.entryLabels = [...(this.labels??[])];
// Focus remarks field
this.$nextTick(() => this.$refs.remarks.focus());
}
},
tstamp () {
if (this.tstamp) {
this.tsDate = this.tstamp.substr(0, 10);
this.tsTime = this.tstamp.substr(11, 8);
} else if (this.sequence || this.point) {
this.tsDate = null;
this.tsTime = null;
} else {
const ts = (new Date()).toISOString();
this.tsDate = ts.substr(0, 10);
this.tsTime = ts.substr(11, 8);
}
},
sequence () {
if (this.sequence && !this.tstamp) {
this.tsDate = null;
this.tsTime = null;
}
},
point () {
if (this.point && !this.tstamp) {
this.tsDate = null;
this.tsTime = null;
}
},
entryTstamp (n, o) {
//this.updateAncillaryData();
},
entrySequence (n, o) {
//this.updateAncillaryData();
},
entryPoint (n, o) {
//this.updateAncillaryData();
},
entryRemarks () {
if (this.entryRemarks?.labels) {
this.entryLabels = [...this.entryRemarks.labels];
} else if (!this.entryRemarks) {
this.entryLabels = [];
}
},
selectedLabels () {
this.entryLabels = this.selectedLabels.map(this.labelToItem)
},
entryLabels () {
this.search = '';
},
},
methods: {
labelToItem (k) {
return {
text: k,
icon: this.availableLabels[k].view?.icon,
colour: this.availableLabels[k].view?.colour,
title: this.availableLabels[k].view?.description
};
},
searchRemarks (item, queryText, itemText) {
const needle = queryText.toLowerCase();
const text = item.text.toLowerCase();
const keywords = item.keywords.map(i => i.toLowerCase());
const labels = item.labels.map(i => i.toLowerCase());
return text.includes(needle) ||
keywords.some(i => i.includes(needle)) ||
labels.some(i => i.includes(needle));
},
handleRemarksMenu (event) {
if (typeof event == 'boolean') {
this.remarksMenu = event;
} else {
this.entryRemarks = event;
this.remarksMenu = false;
}
},
async getPointData () {
const url = `/project/${this.$route.params.project}/sequence/${this.entrySequence}/${this.entryPoint}`;
return await this.api([url]);
},
async getTstampData () {
const url = `/navdata?q=tstamp:${this.entryTstamp}&tolerance=2500`;
return await this.api([url]);
},
async updateAncillaryData () {
if (this.entrySequence && this.entryPoint) {
// Fetch data for this sequence / point
const data = await this.getPointData();
if (data?.tstamp) {
this.tsDate = data.tstamp.substr(0, 10);
this.tsTime = data.tstamp.substr(11, 8);
}
if (data?.geometry) {
this.entryLongitude = (data?.geometry?.coordinates??[])[0];
this.entryLatitude = (data?.geometry?.coordinates??[])[1];
}
} else if (!this.entrySequence && !this.entryPoint && this.entryTstamp) {
// Fetch data for this timestamp
const data = ((await this.getTstampData())??[])[0];
console.log("TS DATA", data);
if (data?._sequence && data?.shot) {
this.entrySequence = Number(data._sequence);
this.entryPoint = data.shot;
}
if (data?.tstamp) {
this.tsDate = data.tstamp.substr(0, 10);
this.tsTime = data.tstamp.substr(11, 8);
}
if (data?.longitude && data?.latitude) {
this.entryLongitude = data.longitude;
this.entryLatitude = data.latitude;
}
}
},
close () {
this.entryLabels = this.selectedLabels.map(this.labelToItem)
this.$emit("input", false);
},
save () {
// In case the focus goes directly from the remarks field
// to the Save button.
if (this.entryRemarksInput != this.entryRemarksText) {
this.entryRemarks = this.entryRemarksInput;
}
const data = {
id: this.id,
remarks: this.entryRemarksText,
labels: this.entryLabels
};
/* NOTE This is the purist way.
* Where we expect that the server will match
* timestamps with shotpoints and so on
*
if (this.entrySequence && this.entryPoint) {
data.sequence = this.entrySequence;
data.point = this.entryPoint;
} else {
data.tstamp = this.entryTstamp;
}
*/
/* NOTE And this is the pragmatic way.
*/
data.tstamp = this.entryTstamp;
if (this.entrySequence && this.entryPoint) {
data.sequence = this.entrySequence;
data.point = this.entryPoint;
}
this.$emit("changed", data);
this.$emit("input", false);
},
...mapActions(["api"])
},
}
</script>

View File

@@ -11,7 +11,7 @@
<v-icon v-if="serverConnected" class="mr-6" small title="Connected to server">mdi-lan-connect</v-icon>
<v-icon v-else class="mr-6" small color="red" title="Server connection lost (we'll reconnect automatically when the server comes back)">mdi-lan-disconnect</v-icon>
<dougal-notifications-control class="mr-6"></dougal-notifications-control>
<div title="Night mode">
@@ -31,7 +31,7 @@
font-family: "Bank Gothic Medium";
src: local("Bank Gothic Medium"), url("/fonts/bank-gothic-medium.woff");
}
.brand {
font-family: "Bank Gothic Medium";
}
@@ -56,7 +56,7 @@ export default {
const date = new Date();
return date.getUTCFullYear();
},
...mapState({serverConnected: state => state.notify.serverConnected})
}
};

View File

@@ -50,7 +50,7 @@ import unpack from '@/lib/unpack.js';
export default {
name: 'DougalGraphArraysIJScatter',
props: [ "data", "settings" ],
data () {
@@ -62,15 +62,15 @@ export default {
histogram: false
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
if (newVal === null) {
this.busy = true;
@@ -79,46 +79,46 @@ export default {
this.plot();
}
},
settings () {
for (const key in this.settings) {
this[key] = this.settings[key];
}
},
histogram () {
this.plot();
this.$emit("update:settings", {[`${this.$options.name}.histogram`]: this.histogram});
},
scatterplot () {
this.plot();
this.$emit("update:settings", {[`${this.$options.name}.scatterplot`]: this.scatterplot});
}
},
methods: {
plot () {
this.plotSeries();
if (this.histogram) {
this.plotHistogram();
}
if (this.scatterplot) {
this.plotScatter();
}
},
plotSeries () {
if (!this.data) {
return;
}
function transform (d, idx=0, otherParams={}) {
const errortype = d.errorfinal ? "errorfinal" : "errorraw";
const coords = unpack(unpack(d, errortype), "coordinates");
@@ -141,7 +141,7 @@ export default {
};
return data;
}
const data = [
transform(this.data.items, 1, {
xaxis: 'x',
@@ -155,7 +155,7 @@ export default {
})
];
this.busy = false;
const layout = {
//autosize: true,
title: {text: "Inline / crossline error sequence %{meta.sequence}"},
@@ -177,25 +177,25 @@ export default {
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph[0] = Plotly.newPlot(this.$refs.graph0, data, layout, config);
},
plotScatter () {
console.log("plot");
if (!this.data) {
console.log("missing data");
return;
}
console.log("Will plot sequence", this.data.meta.project, this.data.meta.sequence);
function transform (d) {
const errortype = d.errorfinal ? "errorfinal" : "errorraw";
const coords = unpack(unpack(d, errortype), "coordinates");
@@ -217,10 +217,10 @@ export default {
}];
return data;
}
const data = transform(this.data.items);
this.busy = false;
const layout = {
//autosize: true,
//title: {text: "Inline / crossline error sequence %{meta.sequence}"},
@@ -235,22 +235,22 @@ export default {
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph[1] = Plotly.newPlot(this.$refs.graph1, data, layout, config);
},
plotHistogram () {
if (!this.data) {
console.log("missing data");
return;
}
function transform (d, idx=0, otherParams={}) {
const errortype = d.errorfinal ? "errorfinal" : "errorraw";
const coords = unpack(unpack(d, errortype), "coordinates");
@@ -271,7 +271,7 @@ export default {
};
return data;
}
const data = [
transform(this.data.items, 0, {
xaxis: 'x',
@@ -284,7 +284,7 @@ export default {
name: 'Inline'
})
];
const layout = {
//autosize: true,
//title: {text: "Inline / crossline error sequence %{meta.sequence}"},
@@ -308,7 +308,7 @@ export default {
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
@@ -319,12 +319,12 @@ export default {
this.graph[2] = Plotly.newPlot(this.$refs.graph2, data, layout, config);
},
replot () {
if (!this.graph.length) {
return;
}
console.log("Replotting");
this.graph.forEach( (graph, idx) => {
const ref = this.$refs["graph"+idx];
@@ -334,23 +334,23 @@ export default {
});
});
},
},
async mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graph0);
this.resizeObserver.observe(this.$refs.graph1);
this.resizeObserver.observe(this.$refs.graph2);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graph2);

View File

@@ -6,7 +6,7 @@
<v-switch v-model="shotpoint" label="Shotpoint"></v-switch>
<v-switch class="ml-4" v-model="violinplot" label="Violin plot"></v-switch>
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
@@ -49,7 +49,7 @@ import * as aes from '@/lib/graphs/aesthetics.js';
export default {
name: 'DougalGraphGunsDepth',
props: [ "data", "settings" ],
data () {
@@ -62,16 +62,16 @@ export default {
violinplot: false
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
console.log("data changed");
if (newVal === null) {
this.busy = true;
} else {
@@ -79,42 +79,42 @@ export default {
this.plot();
}
},
settings () {
for (const key in this.settings) {
this[key] = this.settings[key];
}
},
shotpoint () {
if (this.shotpoint) {
this.replot();
}
this.$emit("update:settings", {[`${this.$options.name}.shotpoint`]: this.shotpoint});
},
violinplot () {
if (this.violinplot) {
this.plotViolin();
}
this.$emit("update:settings", {[`${this.$options.name}.violinplot`]: this.violinplot});
}
},
methods: {
plot () {
this.plotSeries();
if (this.violinplot) {
this.plotViolin();
}
},
async plotSeries () {
function transformSeries (d, src_number, otherParams={}) {
const meta = src_number
? unpack(d, "meta").filter( s => s.src_number == src_number )
: unpack(d, "meta");
@@ -122,11 +122,11 @@ export default {
const gunDepths = guns.map(s => s.map(g => g[10]));
const gunDepthsSorted = gunDepths.map(s => d3a.sort(s));
const gunsAvgDepth = gunDepths.map( (s, sidx) => d3a.mean(s) );
const x = src_number
? unpack(d.filter(s => s.meta.src_number == src_number), "point")
: unpack(d, "point");
const tracesGunDepths = [{
type: "scatter",
mode: "lines",
@@ -150,7 +150,7 @@ export default {
y: gunDepthsSorted.map(s => d3a.quantileSorted(s, 0.75)),
...aes.gunArrays[src_number || 1].max
}];
const tracesGunsDepthsIndividual = {
//name: `Array ${src_number} outliers`,
type: "scatter",
@@ -166,22 +166,22 @@ export default {
).flat(),
...aes.gunArrays[src_number || 1].out
};
const data = [ ...tracesGunDepths, tracesGunsDepthsIndividual ]
return data;
}
if (!this.data) {
console.log("missing data");
return;
}
const sources = [ ...new Set(unpack(this.data.items, "meta").map( s => s.src_number ))];
const data = sources.map( src_number => transformSeries(this.data.items, src_number) ).flat();
console.log("Sources", sources);
console.log(data);
this.busy = false;
const layout = {
//autosize: true,
title: {text: "Gun depths sequence %{meta.sequence}"},
@@ -198,12 +198,12 @@ export default {
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphSeries, data, layout, config);
this.$refs.graphSeries.on('plotly_hover', (d) => {
const point = d.points[0].x;
@@ -220,7 +220,7 @@ export default {
groups: unpack(guns, 0)
}],
}];
const layout = {
title: {text: "Gun depths shot %{meta.point}"},
height: 300,
@@ -236,19 +236,19 @@ export default {
point
}
};
const config = { displaylogo: false };
Plotly.react(this.$refs.graphBar, data, layout, config);
});
},
async plotViolin () {
function transformViolin (d, opts = {}) {
const styles = [];
unpack(unpack(d, "meta"), "guns").flat().forEach(i => {
const gunId = i[1];
const arrayId = i[2];
@@ -256,7 +256,7 @@ export default {
styles[gunId] = Object.assign({target: gunId}, aes.gunArrayViolins[arrayId]);
}
});
const data = {
type: 'violin',
x: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1), // Gun number
@@ -277,21 +277,21 @@ export default {
styles: styles.filter(i => !!i)
}]
}
return data;
}
console.log("plot violin");
if (!this.data) {
console.log("missing data");
return;
}
console.log("Will plot sequence", this.data.meta.project, this.data.meta.sequence);
const data = [ transformViolin(this.data.items) ];
this.busy = false;
const layout = {
//autosize: true,
showlegend: false,
@@ -307,21 +307,21 @@ export default {
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphViolin, data, layout, config);
},
replot () {
if (!this.graph) {
return;
}
console.log("Replotting");
Object.values(this.$refs).forEach( ref => {
if (ref.data) {
@@ -333,25 +333,25 @@ export default {
}
});
},
...mapActions(["api"])
},
mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graphSeries);
this.resizeObserver.observe(this.$refs.graphViolin);
this.resizeObserver.observe(this.$refs.graphBar);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graphBar);

View File

@@ -3,7 +3,7 @@
<v-card-title class="headline">
Gun details
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
@@ -37,7 +37,7 @@ import * as aes from '@/lib/graphs/aesthetics.js';
export default {
name: 'DougalGraphGunsHeatmap',
props: [ "data" ],
data () {
@@ -54,16 +54,16 @@ export default {
]
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
console.log("data changed");
if (newVal === null) {
this.busy = true;
} else {
@@ -71,31 +71,31 @@ export default {
this.plot();
}
},
violinplot () {
if (this.violinplot) {
this.plotViolin();
}
}
},
methods: {
plot () {
this.plotHeat();
},
async plotHeat () {
if (!this.data) {
console.log("missing data");
return;
}
function transform (data, aspects=["Depth", "Pressure"]) {
const facets = [
// Mode
{
@@ -103,9 +103,9 @@ export default {
name: "Mode",
hovertemplate: "SP%{x}<br>%{y}<br>%{text}",
},
text: [ "Off", "Auto", "Manual", "Disabled" ],
conversion: (gun, shot) => {
switch (gun[3]) {
case "A":
@@ -119,16 +119,16 @@ export default {
}
}
},
// Detect
{
params: {
name: "Detect",
hovertemplate: "SP%{x}<br>%{y}<br>%{text}",
},
text: [ "Zero", "Peak", "Level" ],
conversion: (gun, shot) => {
switch (gun[4]) {
case "P":
@@ -140,41 +140,41 @@ export default {
}
}
},
// Autofire
{
params: {
name: "Autofire",
hovertemplate: "SP%{x}<br>%{y}<br>%{text}",
},
text: [ "False", "True" ],
conversion: (gun, shot) => {
return gun[5] ? 1 : 0;
}
},
// Aimpoint
{
params: {
name: "Aimpoint",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms"
},
conversion: (gun, shot) => gun[7]
},
// Firetime
{
params: {
name: "Firetime",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms"
},
conversion: (gun, shot) => gun[2] == shot.meta.src_number ? gun[8] : null
},
// Delta
{
params: {
@@ -187,7 +187,7 @@ export default {
zmin: -2,
zmax: 2
},
conversion: (gun, shot) => gun[2] == shot.meta.src_number ? gun[7]-gun[8] : null
},
@@ -197,7 +197,7 @@ export default {
name: "Delay",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms"
},
conversion: (gun, shot) => gun[9]
},
@@ -207,7 +207,7 @@ export default {
name: "Depth",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} m"
},
conversion: (gun, shot) => gun[10]
},
@@ -217,7 +217,7 @@ export default {
name: "Pressure",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} psi"
},
conversion: (gun, shot) => gun[11]
},
@@ -227,7 +227,7 @@ export default {
name: "Volume",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} in³"
},
conversion: (gun, shot) => gun[12]
},
@@ -237,14 +237,14 @@ export default {
name: "Filltime",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms"
},
// NOTE that filltime is applicable to the *non*-firing guns
conversion: (gun, shot) => gun[2] == shot.meta.src_number ? null : gun[13]
}
];
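// For reference, the gun tuple indices used by the facets above
// (as inferred from the conversions): gun[1] gun number,
// gun[2] array/source number, gun[3] mode, gun[4] detect,
// gun[5] autofire, gun[7] aimpoint (ms), gun[8] firetime (ms),
// gun[9] delay (ms), gun[10] depth (m), gun[11] pressure (psi),
// gun[12] volume (in³), gun[13] filltime (ms).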
// Get gun numbers
const guns = [...new Set(data.map( s => s.meta.guns.map( g => g[1] ) ).flat())];
@@ -256,13 +256,13 @@ export default {
// ]
// }
const z = {};
// x is an array of shotpoints
const x = [];
// y is an array of gun numbers
const y = guns.map( gun => `G${gun}` );
// Build array of guns (i.e., populate z)
// We prefer to do this outside the shot-to-shot loop
// for efficiency
@@ -273,15 +273,15 @@ export default {
z[label][i] = [];
}
}
// Populate array of guns with shotpoint data
for (let shot of data) {
x.push(shot.point);
for (const facet of facets) {
const label = facet.params.name;
const facetGunsArray = z[label];
for (const gun of shot.meta.guns) {
const gunIndex = gun[1]-1;
const facetGun = facetGunsArray[gunIndex];
@@ -289,10 +289,10 @@ export default {
}
}
}
return aspects.map( (aspect, idx) => {
const facet = facets.find(el => el.params.name == aspect) || {};
const defaultParams = {
name: aspect,
type: "heatmap",
@@ -304,15 +304,15 @@ export default {
xaxis: "x",
yaxis: "y" + (idx > 0 ? idx+1 : "")
}
return Object.assign({}, defaultParams, facet.params);
});
}
const data = transform(this.data.items, this.aspects);
this.busy = false;
const layout = {
title: {text: "Gun details sequence %{meta.sequence}"},
height: 200*this.aspects.length,
@@ -327,15 +327,15 @@ export default {
*/
//autosize: true,
// colorscale: "sequential",
xaxis: {
title: "Shotpoint",
showspikes: true
},
meta: this.data.meta
};
this.aspects.forEach ( (aspect, idx) => {
const num = idx+1;
const key = "yaxis" + num;
@@ -352,21 +352,21 @@ export default {
domain
}
});
const config = {
//editable: true,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphHeat, data, layout, config);
},
replot () {
if (!this.graph) {
return;
}
console.log("Replotting");
Object.values(this.$refs).forEach( ref => {
if (ref.data) {
@@ -378,23 +378,23 @@ export default {
}
});
},
...mapActions(["api"])
},
mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graphHeat);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graphHeat);

View File

@@ -6,7 +6,7 @@
<v-switch v-model="shotpoint" label="Shotpoint"></v-switch>
<v-switch class="ml-4" v-model="violinplot" label="Violin plot"></v-switch>
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
@@ -49,7 +49,7 @@ import * as aes from '@/lib/graphs/aesthetics.js';
export default {
name: 'DougalGraphGunsPressure',
props: [ "data", "settings" ],
data () {
@@ -62,16 +62,16 @@ export default {
violinplot: false
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
console.log("data changed");
if (newVal === null) {
this.busy = true;
} else {
@@ -79,42 +79,42 @@ export default {
this.plot();
}
},
settings () {
for (const key in this.settings) {
this[key] = this.settings[key];
}
},
shotpoint () {
if (this.shotpoint) {
this.replot();
}
this.$emit("update:settings", {[`${this.$options.name}.shotpoint`]: this.shotpoint});
},
violinplot () {
if (this.violinplot) {
this.plotViolin();
}
this.$emit("update:settings", {[`${this.$options.name}.violinplot`]: this.violinplot});
}
},
methods: {
plot () {
this.plotSeries();
if (this.violinplot) {
this.plotViolin();
}
},
async plotSeries () {
function transformSeries (d, src_number, otherParams={}) {
const meta = src_number
? unpack(d, "meta").filter( s => s.src_number == src_number )
: unpack(d, "meta");
@@ -126,12 +126,12 @@ export default {
const gunsWeightedAvgPressure = gunPressures.map( (s, sidx) =>
d3a.sum(s.map( (pressure, gidx) => pressure * gunPressureWeights[sidx][gidx] )) / d3a.sum(gunPressureWeights[sidx])
);
const manifold = unpack(meta, "manifold");
const x = src_number
? unpack(d.filter(s => s.meta.src_number == src_number), "point")
: unpack(d, "point");
const traceManifold = {
name: "Manifold",
type: "scatter",
@@ -140,7 +140,7 @@ export default {
x,
y: manifold,
};
const tracesGunPressures = [{
type: "scatter",
mode: "lines",
@@ -164,7 +164,7 @@ export default {
y: gunPressuresSorted.map(s => d3a.quantileSorted(s, 0.75)),
...aes.gunArrays[src_number || 1].max
}];
const tracesGunsPressuresIndividual = {
//name: `Array ${src_number} outliers`,
type: "scatter",
@@ -180,22 +180,22 @@ export default {
).flat(),
...aes.gunArrays[src_number || 1].out
};
const data = [ traceManifold, ...tracesGunPressures, tracesGunsPressuresIndividual ]
return data;
}
if (!this.data) {
console.log("missing data");
return;
}
const sources = [ ...new Set(unpack(this.data.items, "meta").map( s => s.src_number ))];
const data = sources.map( src_number => transformSeries(this.data.items, src_number) ).flat();
console.log("Sources", sources);
console.log(data);
this.busy = false;
const layout = {
//autosize: true,
title: {text: "Gun pressures sequence %{meta.sequence}"},
@@ -212,12 +212,12 @@ export default {
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphSeries, data, layout, config);
this.$refs.graphSeries.on('plotly_hover', (d) => {
const point = d.points[0].x;
@@ -237,7 +237,7 @@ export default {
groups: unpack(guns, 0)
}],
}];
const layout = {
title: {text: "Gun pressures shot %{meta.point}"},
height: 300,
@@ -253,19 +253,19 @@ export default {
point
}
};
const config = { displaylogo: false };
Plotly.react(this.$refs.graphBar, data, layout, config);
});
},
async plotViolin () {
function transformViolin (d, opts = {}) {
const styles = [];
unpack(unpack(d, "meta"), "guns").flat().forEach(i => {
const gunId = i[1];
const arrayId = i[2];
@@ -273,7 +273,7 @@ export default {
styles[gunId] = Object.assign({target: gunId}, aes.gunArrayViolins[arrayId]);
}
});
const data = {
type: 'violin',
x: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1), // Gun number
@@ -294,21 +294,21 @@ export default {
styles: styles.filter(i => !!i)
}]
}
return data;
}
console.log("plot violin");
if (!this.data) {
console.log("missing data");
return;
}
console.log("Will plot sequence", this.data.meta.project, this.data.meta.sequence);
const data = [ transformViolin(this.data.items) ];
this.busy = false;
const layout = {
//autosize: true,
showlegend: false,
@@ -324,21 +324,21 @@ export default {
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphViolin, data, layout, config);
},
replot () {
if (!this.graph) {
return;
}
console.log("Replotting");
Object.values(this.$refs).forEach( ref => {
if (ref.data) {
@@ -350,25 +350,25 @@ export default {
}
});
},
...mapActions(["api"])
},
mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graphSeries);
this.resizeObserver.observe(this.$refs.graphViolin);
this.resizeObserver.observe(this.$refs.graphBar);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graphBar);

View File

@@ -6,7 +6,7 @@
<v-switch v-model="shotpoint" label="Shotpoint"></v-switch>
<v-switch class="ml-4" v-model="violinplot" label="Violin plot"></v-switch>
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
@@ -49,7 +49,7 @@ import * as aes from '@/lib/graphs/aesthetics.js';
export default {
name: 'DougalGraphGunsTiming',
props: [ "data", "settings" ],
data () {
@@ -62,16 +62,16 @@ export default {
violinplot: false
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
console.log("data changed");
if (newVal === null) {
this.busy = true;
} else {
@@ -79,42 +79,42 @@ export default {
this.plot();
}
},
settings () {
for (const key in this.settings) {
this[key] = this.settings[key];
}
},
shotpoint () {
if (this.shotpoint) {
this.replot();
}
this.$emit("update:settings", {[`${this.$options.name}.shotpoint`]: this.shotpoint});
},
violinplot () {
if (this.violinplot) {
this.plotViolin();
}
this.$emit("update:settings", {[`${this.$options.name}.violinplot`]: this.violinplot});
}
},
methods: {
plot () {
this.plotSeries();
if (this.violinplot) {
this.plotViolin();
}
},
async plotSeries () {
function transformSeries (d, src_number, otherParams={}) {
const meta = src_number
? unpack(d, "meta").filter( s => s.src_number == src_number )
: unpack(d, "meta");
@@ -122,11 +122,11 @@ export default {
const gunTimings = guns.map(s => s.map(g => g[9]));
const gunTimingsSorted = gunTimings.map(s => d3a.sort(s));
const gunsAvgTiming = gunTimings.map( (s, sidx) => d3a.mean(s) );
const x = src_number
? unpack(d.filter(s => s.meta.src_number == src_number), "point")
: unpack(d, "point");
const tracesGunTimings = [{
type: "scatter",
mode: "lines",
@@ -150,7 +150,7 @@ export default {
y: gunTimingsSorted.map(s => d3a.quantileSorted(s, 0.75)),
...aes.gunArrays[src_number || 1].max
}];
const tracesGunsTimingsIndividual = {
//name: `Array ${src_number} outliers`,
type: "scatter",
@@ -166,22 +166,22 @@ export default {
).flat(),
...aes.gunArrays[src_number || 1].out
};
const data = [ ...tracesGunTimings, tracesGunsTimingsIndividual ]
return data;
}
if (!this.data) {
console.log("missing data");
return;
}
const sources = [ ...new Set(unpack(this.data.items, "meta").map( s => s.src_number ))];
const data = sources.map( src_number => transformSeries(this.data.items, src_number) ).flat();
console.log("Sources", sources);
console.log(data);
this.busy = false;
const layout = {
//autosize: true,
title: {text: "Gun timings sequence %{meta.sequence}"},
@@ -198,12 +198,12 @@ export default {
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphSeries, data, layout, config);
this.$refs.graphSeries.on('plotly_hover', (d) => {
const point = d.points[0].x;
@@ -220,7 +220,7 @@ export default {
groups: unpack(guns, 0)
}],
}];
const layout = {
title: {text: "Gun timings shot %{meta.point}"},
height: 300,
@@ -236,19 +236,19 @@ export default {
point
}
};
const config = { displaylogo: false };
Plotly.react(this.$refs.graphBar, data, layout, config);
});
},
async plotViolin () {
function transformViolin (d, opts = {}) {
const styles = [];
unpack(unpack(d, "meta"), "guns").flat().forEach(i => {
const gunId = i[1];
const arrayId = i[2];
@@ -256,7 +256,7 @@ export default {
styles[gunId] = Object.assign({target: gunId}, aes.gunArrayViolins[arrayId]);
}
});
const data = {
type: 'violin',
x: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1), // Gun number
@@ -277,21 +277,21 @@ export default {
styles: styles.filter(i => !!i)
}]
}
return data;
}
console.log("plot violin");
if (!this.data) {
console.log("missing data");
return;
}
console.log("Will plot sequence", this.data.meta.project, this.data.meta.sequence);
const data = [ transformViolin(this.data.items) ];
this.busy = false;
const layout = {
//autosize: true,
showlegend: false,
@@ -307,21 +307,21 @@ export default {
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphViolin, data, layout, config);
},
replot () {
if (!this.graph) {
return;
}
console.log("Replotting");
Object.values(this.$refs).forEach( ref => {
if (ref.data) {
@@ -333,25 +333,25 @@ export default {
}
});
},
...mapActions(["api"])
},
mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graphSeries);
this.resizeObserver.observe(this.$refs.graphViolin);
this.resizeObserver.observe(this.$refs.graphBar);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graphBar);

View File

@@ -1,21 +1,21 @@
<template>
<v-dialog v-model="open">
<template v-slot:activator="{ on, attrs }">
<v-btn icon v-bind="attrs" v-on="on" title="Configure visible aspects">
<v-icon small>mdi-wrench-outline</v-icon>
</v-btn>
</template>
<v-card>
<v-list nav subheader>
<v-subheader>Visualisations</v-subheader>
<v-list-item-group v-model="aspectsVisible" multiple>
<v-list-item value="DougalGraphGunsPressure">
<template v-slot:default="{ active }">
<v-list-item-action>
@@ -28,7 +28,7 @@
</v-list-item-content>
</template>
</v-list-item>
<v-list-item value="DougalGraphGunsTiming">
<template v-slot:default="{ active }">
<v-list-item-action>
@@ -41,7 +41,7 @@
</v-list-item-content>
</template>
</v-list-item>
<v-list-item value="DougalGraphGunsDepth">
<template v-slot:default="{ active }">
<v-list-item-action>
@@ -54,7 +54,7 @@
</v-list-item-content>
</template>
</v-list-item>
<v-list-item value="DougalGraphGunsHeatmap">
<template v-slot:default="{ active }">
<v-list-item-action>
@@ -67,7 +67,7 @@
</v-list-item-content>
</template>
</v-list-item>
<v-list-item value="DougalGraphArraysIJScatter">
<template v-slot:default="{ active }">
<v-list-item-action>
@@ -83,14 +83,14 @@
</v-list-item-group>
</v-list>
<v-divider></v-divider>
<v-card-actions>
<v-btn v-if="user" color="warning" text @click="save" :title="'Save as preference for user '+user.name+' on this computer (other users may have other defaults).'">Save as default</v-btn>
<v-spacer></v-spacer>
<v-btn color="primary" text @click="open=false">Close</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
@@ -102,20 +102,20 @@ import { mapActions, mapGetters } from 'vuex';
export default {
name: "DougalGraphSettingsSequence",
props: [
"aspects"
],
data () {
return {
open: false,
aspectsVisible: this.aspects || []
}
},
watch: {
aspects () {
// Update the aspects selection list iff the list
// is not currently open.
@@ -123,19 +123,19 @@ export default {
this.aspectsVisible = this.aspects;
}
}
},
computed: {
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
},
methods: {
save () {
this.open = false;
this.$nextTick( () => this.$emit("update:aspects", {aspects: [...this.aspectsVisible]}) );
},
reset () {
this.aspectsVisible = this.aspects || [];
}

View File

@@ -3,7 +3,7 @@
<slot name="empty"></slot>
</div>
<div class="line-status" v-else-if="sequenceHref">
<router-link v-for="sequence in sequences"
<router-link v-for="sequence in sequences" :key="sequence.sequence"
class="sequence"
:class="sequence.status"
:style="style(sequence)"
@@ -32,12 +32,12 @@
min-height 16px
background-color #d3d3d314
border-radius 4px
.sequence
flex 1 1 auto
opacity 0.5
border-radius 4px
&.ntbp
background-color red
&.raw
@@ -54,13 +54,13 @@
export default {
name: 'DougalLineStatus',
props: {
preplot: Object,
sequences: Array,
"sequence-href": Function
},
methods: {
style (s) {
const values = {};
@@ -69,28 +69,28 @@ export default {
: s.status == "ntbp"
? (s.fsp_final || s.fsp)
: s.fsp; /* status == "raw" */
const lsp = s.status == "final"
? s.lsp_final
: s.status == "ntbp"
? (s.lsp_final || s.lsp)
: s.lsp; /* status == "raw" */
const pp0 = Math.min(this.preplot.fsp, this.preplot.lsp);
const pp1 = Math.max(this.preplot.fsp, this.preplot.lsp);
const len = pp1-pp0;
const sp0 = Math.max(Math.min(fsp, lsp), pp0);
const sp1 = Math.min(Math.max(fsp, lsp), pp1);
const left = (sp0-pp0)/len;
const right = 1-((sp1-pp0)/len);
values["margin-left"] = left*100 + "%";
values["margin-right"] = right*100 + "%";
return values;
},
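// style() example: with preplot {fsp: 1000, lsp: 2000} and a raw
// sequence {fsp: 1200, lsp: 1600}, len = 1000, so the bar gets
// margin-left 20% and margin-right 40%.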
title (s) {
const status = s.status == "final"
? "Final"
@@ -101,13 +101,13 @@ export default {
: s.status == "planned"
? "Planned"
: s.status;
const remarks = "\n"+[s.remarks, s.remarks_final].join("\n").trim()
return `Sequence ${s.sequence} ${status} (${s.fsp_final || s.fsp}–${s.lsp_final || s.lsp})${remarks}`;
}
}
}
</script>

View File

@@ -12,7 +12,7 @@
<v-toolbar-title class="mx-2" @click="$router.push('/')" style="cursor: pointer;">Dougal</v-toolbar-title>
<v-spacer></v-spacer>
<v-menu bottom offset-y>
<template v-slot:activator="{on, attrs}">
<v-hover v-slot="{hover}">
@@ -29,17 +29,17 @@
</v-btn>
</v-hover>
</template>
<v-list dense>
<v-list-item :href="`/settings/equipment`">
<v-list-item-title>Equipment list</v-list-item-title>
<v-list-item-action><v-icon small>mdi-view-list</v-icon></v-list-item-action>
</v-list-item>
</v-list>
</v-menu>
<v-breadcrumbs :items="path"></v-breadcrumbs>
<template v-if="$route.name != 'Login'">

View File

@@ -1,5 +1,5 @@
export default function FormatTimestamp (str) {
const d = new Date(str);
if (isNaN(d)) {

View File

@@ -1,4 +1,4 @@
export default function unpack(rows, key) {
return rows && rows.map( row => row[key] );
};
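// e.g. unpack([{a: 1}, {a: 2}], "a") → [1, 2]; a falsy `rows` is
// returned unchanged so callers can chain safely.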

View File

@@ -1,4 +1,4 @@
function withParentProps(item, parent, childrenKey, prop, currentValue) {
if (!Array.isArray(parent)) {
@@ -29,43 +29,43 @@ function withParentProps(item, parent, childrenKey, prop, currentValue) {
function dms (lat, lon) {
const λh = lat < 0 ? "S" : "N";
const φh = lon < 0 ? "W" : "E";
const λn = Math.abs(lat);
const φn = Math.abs(lon);
const λi = Math.trunc(λn);
const φi = Math.trunc(φn);
const λf = λn - λi;
const φf = φn - φi;
const λs = ((λf*3600)%60).toFixed(1);
const φs = ((φf*3600)%60).toFixed(1);
const λm = Math.trunc(λf*60);
const φm = Math.trunc(φf*60);
const λ =
String(λi).padStart(2, "0") + "°" +
String(λm).padStart(2, "0") + "'" +
String(λs).padStart(4, "0") + '" ' +
λh;
const φ =
String(φi).padStart(3, "0") + "°" +
String(φm).padStart(2, "0") + "'" +
String(φs).padStart(4, "0") + '" ' +
φh;
return λ+" "+φ;
}
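// e.g. dms(58.5, -3.2) → `58°30'00.0" N 003°12'00.0" W`
// (note: the λ* variables hold latitude and the φ* variables longitude,
// the reverse of the usual convention; the output is still lat-first).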
function geometryAsString (item, opts = {}) {
const key = "key" in opts ? opts.key : "geometry";
const formatDMS = opts.dms;
let str = "";
if (key in item) {
const geometry = item[key];
if (geometry && "coordinates" in geometry) {
@@ -76,7 +76,7 @@ function geometryAsString (item, opts = {}) {
str = `${geometry.coordinates[1].toFixed(6)}, ${geometry.coordinates[0].toFixed(6)}`;
}
}
if (str) {
if (opts.url) {
if (typeof opts.url === 'string') {
@@ -88,7 +88,7 @@ function geometryAsString (item, opts = {}) {
}
}
}
return str;
}
@@ -117,10 +117,10 @@ function geometryAsString (item, opts = {}) {
* not exist or is not searched for.
*/
function preferencesλ (preferences) {
return function (key, defaults={}) {
const keys = Object.keys(preferences).filter(str => str.startsWith(key+".") || str == key);
const settings = {...defaults};
for (const str of keys) {
const k = str == key ? str : str.substring(key.length+1);
@@ -130,7 +130,7 @@ function preferencesλ (preferences) {
return settings;
}
}
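// Usage sketch (assuming the elided loop body assigns
// settings[k] = preferences[str]):
//   preferencesλ({"Graphs.aspects": ["A", "B"]})("Graphs", {aspects: []})
//   → {aspects: ["A", "B"]}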

View File

@@ -32,4 +32,19 @@ function point (state) {
return Number(v) || v;
}
export default { serverEvent, online, lineName, sequence, line, point };
function position (state) {
const λ = Number(_(state, "serverEvent.payload.new.meta.longitude"));
const φ = Number(_(state, "serverEvent.payload.new.meta.latitude"));
if (!isNaN(λ) && !isNaN(φ)) {
return [ λ, φ ];
}
return null;
}
function timestamp (state) {
const v = _(state, "serverEvent.payload.new.meta.time");
return v;
}
export default { serverEvent, online, lineName, sequence, line, point, position, timestamp };

View File

@@ -1,4 +1,4 @@
function setProjectId (state, pid) {
state.projectId = pid;
}

View File

@@ -1,4 +1,4 @@
function showSnack({commit}, [text, colour]) {
commit('setSnackColour', colour || 'primary');
commit('setSnackText', text);

View File

@@ -1,4 +1,4 @@
function setSnackText (state, text) {
state.snackText = text;
}

View File

@@ -21,7 +21,7 @@ async function logout ({commit, dispatch}) {
commit('setUser', null);
// Should delete JWT cookie
await dispatch('api', ["/logout"]);
// Clear preferences
commit('setPreferences', {});
}
@@ -61,16 +61,16 @@ function setCredentials ({state, commit, getters, dispatch}, force = false) {
*/
function saveUserPreference ({state, commit}, [key, value]) {
const k = `${state.user?.name}.${state.user?.role}.${key}`;
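// e.g. for user "jane" with role "observer" saving "Graphs.aspects",
// the localStorage key is "jane.observer.Graphs.aspects"
// ("jane"/"observer" being hypothetical values).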
if (value !== undefined) {
localStorage.setItem(k, JSON.stringify(value));
const preferences = state.preferences;
preferences[key] = value;
commit('setPreferences', preferences);
} else {
localStorage.removeItem(k);
const preferences = state.preferences;
delete preferences[key];
commit('setPreferences', preferences);
@@ -81,7 +81,7 @@ async function loadUserPreferences ({state, commit}) {
// Get all keys which are of interest to us
const prefix = `${state.user?.name}.${state.user?.role}`;
const keys = Object.keys(localStorage).filter( k => k.startsWith(prefix) );
// Build the preferences object
const preferences = {};
keys.map(str => {
@@ -89,7 +89,7 @@ async function loadUserPreferences ({state, commit}) {
const key = str.split(".").slice(2).join(".");
preferences[key] = value;
});
// Commit it
commit('setPreferences', preferences);
}

View File

@@ -1,4 +1,4 @@
function setCookie (state, cookie) {
state.cookie = cookie;
}

View File

@@ -223,7 +223,7 @@
<v-subheader v-if="!selectedItemHistory || !selectedItemHistory.length"
class="justify-center"
>No more history</v-subheader>
<v-card v-for="item in selectedItemHistory" class="mt-5">
<v-card v-for="(item, key) in selectedItemHistory" class="mt-5" :key="key">
<v-card-title>{{selectedItem.kind}}</v-card-title>
<v-card-subtitle class="text-caption">{{item.tstamp}}</v-card-subtitle>
<v-card-text>

View File

@@ -35,14 +35,14 @@ import { mapActions } from 'vuex';
export default {
name: "FeedViewer",
data () {
return {
timer: null,
feed: {}
}
},
methods: {
parse (text) {
const data = {items:[]};
@@ -50,13 +50,13 @@ export default {
const xml = parser.parseFromString(text, "application/xml");
const feed = xml.getElementsByTagNameNS("http://www.w3.org/2005/Atom", "feed")[0];
const entries = feed.getElementsByTagName("entry");
data.title = feed.getElementsByTagName("title")[0].childNodes[0].textContent;
data.updated = feed.getElementsByTagName("updated")[0].childNodes[0].textContent;
data.link = [...feed.getElementsByTagName("link")].filter(i =>
i.getAttribute("type") == "text/html"
).pop().getAttribute("href");
data.items = [...entries].map(entry => {
const item = {};
const link = entry.getElementsByTagName("link")[0];
@@ -70,18 +70,18 @@ export default {
}
const summaries = entry.getElementsByTagName("summary");
const summary = [...summaries].find(i => i.getAttribute("type") == "xhtml") || summaries[0];
item.summary = summary.innerHTML;
item.id = entry.getElementsByTagName("id")[0].childNodes[0].textContent;
item.title = entry.getElementsByTagName("title")[0].childNodes[0].textContent;
item.updated = entry.getElementsByTagName("updated")[0].childNodes[0].textContent;
return item;
});
return data;
},
/** Try to fix idiosyncrasies and XML bugs in the source.
*/
fixText (text) {
@@ -89,7 +89,7 @@ export default {
// element in the source.
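// e.g. '<hr class="sep">' becomes '<hr class="sep"></hr>', which the
// XML parser then accepts.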
return text.replace(/(<hr( [^>]*)?>)/g, "$1</hr>")
},
async refresh () {
const text = await this.api([`/rss/?remote=${atob(this.$route.params.source)}`, {text:true}]);
try {
@@ -100,15 +100,15 @@ export default {
this.feed = this.parse(this.fixText(text));
}
},
...mapActions(["api"])
},
async mounted () {
await this.refresh();
this.timer = setInterval(this.refresh, 300000);
},
beforeDestroy () {
clearInterval(this.timer);
this.timer = null;

View File

@@ -1,17 +1,17 @@
<template>
<v-card>
<v-toolbar v-if="$route.params.sequence" class="fixed">
<v-toolbar-title>
Sequence {{$route.params.sequence}}
</v-toolbar-title>
<v-spacer></v-spacer>
<dougal-graph-settings-sequence :aspects="aspects" @update:aspects="configure">
</dougal-graph-settings-sequence>
<v-btn icon
:disabled="!($route.params.sequence > firstSequence)"
:to="{name: 'graphsBySequence', params: { sequence: firstSequence }}"
@@ -35,7 +35,7 @@
<v-icon>mdi-debug-step-over</v-icon>
</v-btn>
</template>
<v-list>
<v-list-item>
<v-autocomplete
@@ -63,34 +63,34 @@
>
<v-icon>mdi-skip-forward</v-icon>
</v-btn>
</v-toolbar>
<v-toolbar v-else-if="$route.params.sequence0">
<v-toolbar-title>
Sequences {{$route.params.sequence0}}–{{$route.params.sequence1}}
</v-toolbar-title>
</v-toolbar>
<v-toolbar v-else-if="$route.params.date">
<v-toolbar-title>
Date {{$route.params.date}}
</v-toolbar-title>
</v-toolbar>
<v-toolbar v-else-if="$route.params.date0">
<v-toolbar-title>
Dates {{$route.params.date0}}–{{$route.params.date1}}
</v-toolbar-title>
</v-toolbar>
<v-toolbar flat>
<!--
This is a ghost toolbar so that elements further down in the page are
@@ -98,7 +98,7 @@
-->
</v-toolbar>
<v-container>
<v-container v-if="sequences.length">
<v-row v-for="(item, idx) in visibleItems" :key="idx">
<v-col>
<component
@@ -110,6 +110,12 @@
</v-col>
</v-row>
</v-container>
<v-card-text class="text-center" v-else-if="!loading">
This project has no sequences.
</v-card-text>
<v-card-text class="text-center" v-else>
Loading
</v-card-text>
</v-card>
</template>
@@ -146,7 +152,7 @@ import DougalGraphSettingsSequence from '@/components/graph-settings-sequence.vu
export default {
name: "Graphs",
components: {
DougalGraphSettingsSequence,
DougalGraphArraysIJScatter,
@@ -155,7 +161,7 @@ export default {
DougalGraphGunsDepth,
DougalGraphGunsHeatmap
},
data () {
const items = [
{
@@ -176,7 +182,7 @@ export default {
}
}
];
return {
items,
data: null,
@@ -185,60 +191,60 @@ export default {
aspects: items.map(i => i.component)
};
},
watch: {
preferences () {
this.configure(preferencesλ(this.preferences)(this.$options.name, {aspects: this.aspects}))
}
},
computed: {
getRows() {
return Array(this.rows).fill().map( (el, idx) => idx );
},
getCols () {
return Array(this.cols).fill().map( (el, idx) => idx );
},
visibleItems () {
return this.items.filter( i => this.aspects.includes(i.component) );
},
firstSequence () {
return this.sequences[this.sequences.length-1]?.sequence;
},
prevSequence () {
const seq = Number(this.$route.params.sequence);
const val = this.sequences
.filter(i => i.sequence < seq)
.map(i => i.sequence)
.reduce( (acc, cur) => Math.max(acc, cur), -Infinity);
return isFinite(val) ? val : undefined;
},
nextSequence () {
const seq = Number(this.$route.params.sequence);
const val = this.sequences
.filter(i => i.sequence > seq)
.map(i => i.sequence)
.reduce( (acc, cur) => Math.min(acc, cur), +Infinity);
return isFinite(val) ? val : undefined;
},
lastSequence () {
return this.sequences[0]?.sequence;
},
...mapGetters(['user', 'preferences', 'writeaccess', 'loading', 'serverEvent'])
},
methods: {
configure (data) {
if ("aspects" in data) {
this.aspects = [...data.aspects];
@@ -247,7 +253,7 @@ export default {
this.saveUserPreference([`${this.$options.name}.${key}`, data[key]]);
}
},
attributesFor (item) {
return this.data
? Object.assign({
@@ -256,37 +262,37 @@ export default {
}, item?.attributes)
: null;
},
preferencesFor (key, defaults) {
return preferencesλ(this.preferences)(`${this.$options.name}.${key}`, defaults);
},
gotoSequence(seq) {
this.$route.params.sequence = seq;
},
...mapActions(["api", "showSnack", "saveUserPreference"])
},
beforeRouteLeave (to, from, next) {
this.data = null;
console.log("beforeRouteLeave");
next();
},
async beforeRouteUpdate (to, from, next) {
console.log("beforeRouteUpdate");
this.data = null;
next();
const url = `/project/${this.$route.params.project}/sequence/${this.$route.params.sequence}?project=sequence,point,tstamp,geometrypreplot,errorraw,errorfinal,meta&path=$.raw.smsrc`;
this.data = Object.freeze(await this.api([url]));
this.sequences = await this.api([`/project/${this.$route.params.project}/sequence`]);
},
async beforeRouteEnter (to, from, next) {
console.log("beforeRouteEnter enter");
next( async vm => {
if (vm.$route.params.sequence) {
const url = `/project/${vm.$route.params.project}/sequence/${vm.$route.params.sequence}?project=sequence,point,tstamp,geometrypreplot,errorraw,errorfinal,meta&path=$.raw.smsrc`;
@@ -299,33 +305,37 @@ export default {
if (!vm.sequences.length) {
vm.sequences = await vm.api([`/project/${vm.$route.params.project}/sequence`]);
}
vm.$router.push({name: "graphsBySequence", params: {
project: vm.$route.params.project,
sequence: vm.sequences[0]?.sequence
}});
if (vm.sequences.length) { // Check that the project has at least one sequence
vm.$router.push({name: "graphsBySequence", params: {
project: vm.$route.params.project,
sequence: vm.sequences[0]?.sequence
}});
}
}
console.log("beforeRouteEnter exit");
});
},
async mounted () {
console.log("Graphs mounted");
this.sequences = await this.api([`/project/${this.$route.params.project}/sequence`]);
if (!this.$route.params.sequence) {
this.$router.push({name: "graphsBySequence", params: {
project: this.$route.params.project,
sequence: this.sequences[0]?.sequence
}});
}
const url = `/project/${this.$route.params.project}/sequence/${this.$route.params.sequence}?project=sequence,point,tstamp,geometrypreplot,errorraw,errorfinal,meta&path=$.raw.smsrc`;
this.data = Object.freeze(await this.api([url]));
if (this.sequences && this.sequences.length) {
if (!this.$route.params.sequence) {
this.$router.push({name: "graphsBySequence", params: {
project: this.$route.params.project,
sequence: this.sequences[0]?.sequence
}});
}
const url = `/project/${this.$route.params.project}/sequence/${this.$route.params.sequence}?project=sequence,point,tstamp,geometrypreplot,errorraw,errorfinal,meta&path=$.raw.smsrc`;
this.data = Object.freeze(await this.api([url]));
}
console.log("Mount finished");
}
}
</script>

View File

@@ -15,7 +15,7 @@
</v-toolbar>
</v-card-title>
<v-card-text>
<v-menu v-if="writeaccess"
v-model="contextMenuShow"
:position-x="contextMenuX"
@@ -63,7 +63,7 @@
</v-list-item>
</v-list>
</v-menu>
<v-dialog
v-model="colourPickerShow"
max-width="300"
@@ -118,7 +118,7 @@
@click:row="setActiveItem"
@contextmenu:row="contextMenu"
>
<template v-slot:item.status="{item}">
<dougal-line-status
:preplot="item"
@@ -143,7 +143,7 @@
<template v-slot:item.azimuth="props">
<span>{{ props.value.toFixed(2) }} °</span>
</template>
<template v-slot:item.remarks="{item}">
<v-text-field v-if="edit && edit.line == item.line && edit.key == 'remarks'"
type="text"
@@ -169,9 +169,9 @@
</div>
</template>
</v-data-table>
</v-card-text>
</v-card>
</v-container>
@@ -192,7 +192,7 @@ import DougalLineStatus from '@/components/line-status';
export default {
name: "LineList",
components: {
DougalLineStatus
},
@@ -258,13 +258,13 @@ export default {
edit: null, // {line, key, value}
queuedReload: false,
itemsPerPage: 25,
// Context menu stuff
contextMenuShow: false,
contextMenuX: 0,
contextMenuY: 0,
contextMenuItem: null,
// Colour picker stuff
colourPickerShow: false,
selectedColour: null,
@@ -275,17 +275,17 @@ export default {
computed: {
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
},
watch: {
async edit (newVal, oldVal) {
if (newVal === null && oldVal !== null) {
const item = this.items.find(i => i.line == oldVal.line);
// Get around this Vuetify feature
// https://github.com/vuetifyjs/vuetify/issues/4144
if (oldVal.value === null) oldVal.value = "";
if (item && item[oldVal.key] != oldVal.value) {
if (await this.saveItem(oldVal)) {
item[oldVal.key] = oldVal.value;
@@ -317,51 +317,51 @@ export default {
}
}
},
queuedReload (newVal, oldVal) {
if (newVal && !oldVal && !this.loading) {
this.getLines();
this.getSequences();
}
},
loading (newVal, oldVal) {
if (!newVal && oldVal && this.queuedReload) {
this.getLines();
this.getSequences();
}
},
itemsPerPage (newVal, oldVal) {
localStorage.setItem(`dougal/prefs/${this.user?.name}/${this.$route.params.project}/${this.$options.name}/items-per-page`, newVal);
},
user (newVal, oldVal) {
this.itemsPerPage = Number(localStorage.getItem(`dougal/prefs/${this.user?.name}/${this.$route.params.project}/${this.$options.name}/items-per-page`)) || 25;
}
},
methods: {
itemClass (item) {
const colourClass = item.meta.colour ? "bg-clr-"+item.meta.colour.slice(1) : null;
if (colourClass && ![...this.styles.cssRules].some(i => i.selectorText == "."+colourClass)) {
const rule = `.${colourClass} { background-color: ${item.meta.colour}; }`;
this.styles.insertRule(rule);
}
return [
item.meta.colour ? colourClass : "",
(this.activeItem == item && !this.edit) ? 'blue accent-1 elevation-3' : ''
];
},
isPlanned(item) {
return this.sequences.find(i => i.line == item.line && i.status == 'planned');
},
contextMenu (e, {item}) {
e.preventDefault();
this.contextMenuShow = false;
@@ -370,7 +370,7 @@ export default {
this.contextMenuItem = item;
this.$nextTick( () => this.contextMenuShow = true );
},
setNTBA () {
this.removeFromPlan();
this.saveItem({
@@ -379,7 +379,7 @@ export default {
value: !this.contextMenuItem.ntba
})
},
setComplete () {
this.saveItem({
line: this.contextMenuItem.line,
@@ -414,21 +414,21 @@ export default {
await this.api([url, init]);
}
},
showLineColourDialog () {
this.selectedColour = this.contextMenuItem.meta.colour
? {hexa: this.contextMenuItem.meta.colour}
: null;
this.colourPickerShow = true;
},
setLineColour () {
const items = this.selectOn ? this.selectedRows : [ this.contextMenuItem ];
const colour = this.selectedColour ? this.selectedColour.hex+"80" : null;
this.selectedRows = [];
this.selectOn = false;
for (const item of items) {
if (colour) {
item.meta.colour = colour;
@@ -439,7 +439,7 @@ export default {
this.colourPickerShow = false;
}
},
editItem (item, key) {
this.edit = {
line: item.line,
@@ -447,10 +447,10 @@ export default {
value: item[key]
}
},
async saveItem (edit) {
if (!edit) return;
try {
const url = `/project/${this.$route.params.project}/line/${edit.line}`;
const init = {
@@ -459,7 +459,7 @@ export default {
[edit.key]: edit.value
}
};
let res;
await this.api([url, init, (e, r) => res = r]);
return res && res.ok;
@@ -481,11 +481,11 @@ export default {
this.items = await this.api([url]) || [];
},
async getSequences () {
const urlS = `/project/${this.$route.params.project}/sequence`;
this.sequences = await this.api([urlS]) || [];
const urlP = `/project/${this.$route.params.project}/plan`;
const planned = await this.api([urlP]) || [];
planned.forEach(i => i.status = "planned");
@@ -505,7 +505,7 @@ export default {
this.getLines();
this.getNumLines();
this.getSequences();
// Initialise stylesheet
const el = document.createElement("style");
document.head.appendChild(el);

File diff suppressed because it is too large

View File

@@ -165,15 +165,15 @@ const layers = {
onEachFeature (feature, layer) {
const p = feature.properties;
if (feature.geometry) {
const ntbp = p.ntbp
? " <b>(NTBP)</b>"
: "";
const remarks = p.remarks
? "<hr/>"+markdown(p.remarks)
: "";
const popup = feature.geometry.type == "Point"
? `Raw sequence ${feature.properties.sequence}${ntbp}<br/>Point <b>${feature.properties.line} / ${feature.properties.point}</b><br/>${feature.properties.objref}<br/>${feature.properties.tstamp}`
: `Raw sequence ${p.sequence}${ntbp}<br/>
@@ -183,7 +183,7 @@ const layers = {
${p.duration}<br/>
<table><tr><td><b>${p.fsp}</b></td><td>@ ${ftstamp(p.ts0)}</td></tr><tr><td><b>${p.lsp}</b></td><td>@ ${ftstamp(p.ts1)}</td></tr></table>${remarks}`;
layer.bindTooltip(popup, {sticky: true});
}
}
}),
@@ -204,7 +204,7 @@ const layers = {
},
onEachFeature (feature, layer) {
const p = feature.properties;
const remarks = p.remarks
? "<hr/>"+markdown(p.remarks)
: "";
@@ -221,9 +221,9 @@ const layers = {
layer.bindTooltip(popup, {sticky: true});
}
}),
"Events (QC)": L.geoJSON(null),
"Events (Other)": L.geoJSON(null),
"Real-time": L.realtime({
@@ -394,7 +394,7 @@ export default {
}
}
},
user (newVal, oldVal) {
if (newVal && (!oldVal || newVal.name != oldVal.name)) {
this.initView();
@@ -416,7 +416,7 @@ export default {
//console.log("EVENT", event);
}
},
$route (to, from) {
if (to.name == "map") {
this.setHashMarker();
@@ -431,13 +431,13 @@ export default {
const bbox = new L.GeoJSON(res);
map.fitBounds(bbox.getBounds());
},
getEvents (ffn = i => true) {
return async (success, error) => {
const url = `/project/${this.$route.params.project}/event`;
const data = await this.api([url, {headers: {"Accept": "application/geo+json"}}]);
if (data) {
function colour(feature) {
if (feature && feature.properties && feature.properties.type) {
if (feature.properties.type == "qc") {
@@ -452,7 +452,7 @@ export default {
}
return "brown";
}
const features = data.filter(ffn).map(feature => {
feature.properties.colour = colour(feature);
return feature;
@@ -480,15 +480,15 @@ export default {
for (const l of this.layerRefreshConfig.filter(i => !layerset || layerset.includes(i.layer))) {
if (map.hasLayer(l.layer)) {
const url = l.url(query);
// Skip unnecessary requests
if (url == l.layer.lastRequestURL) continue;
if (l.layer.abort && l.layer.abort instanceof AbortController) {
l.layer.abort.abort();
}
l.layer.abort = new AbortController();
const signal = l.layer.abort.signal;
const init = {
@@ -497,7 +497,7 @@ export default {
Accept: "application/geo+json"
}
};
// Fire all refresh requests asynchronously; this is fine provided
// we don't have hundreds of layers to refresh.
this.api([url, init])
@@ -505,11 +505,11 @@ export default {
if (!layer) {
return;
}
if (typeof l.transform == 'function') {
layer = l.transform(layer);
}
l.layer.clearLayers();
if (layer instanceof L.Layer || (layer.features && layer.features.length < limit) || ("length" in layer && layer.length < limit)) {
if (l.layer.addData) {
@@ -517,7 +517,7 @@ export default {
} else if (l.layer.addLayer) {
l.layer.addLayer(layer);
}
l.layer.lastRequestURL = url;
} else {
console.warn("Too much data from", url);
@@ -551,7 +551,7 @@ export default {
} else {
value = `${zoom}/${lat}/${lng}`;
}
if (value) {
localStorage.setItem(`dougal/prefs/${this.user?.name}/${this.$route.params.project}/${this.$options.name}/view`, value);
}
@@ -559,11 +559,11 @@ export default {
decodeURL () {
const value = localStorage.getItem(`dougal/prefs/${this.user?.name}/${this.$route.params.project}/${this.$options.name}/view`);
if (!value) {
return {};
}
const parts = value.split(":");
const activeOverlays = parts.length > 1 && parts[1].split(";");
const activeLayers = parts.length > 2 && parts[2].split(";");
@@ -574,19 +574,19 @@ export default {
return {position, activeOverlays, activeLayers};
},
initView () {
if (!map) {
return;
}
map.off('overlayadd', this.updateURL);
map.off('overlayremove', this.updateURL);
map.off('layeradd', this.updateURL);
map.off('layerremove', this.updateURL);
const init = this.decodeURL();
if (init.activeOverlays) {
Object.keys(tileMaps).forEach(k => {
const l = tileMaps[k];
@@ -621,16 +621,16 @@ export default {
if (init.position) {
map.setView(init.position.slice(1), init.position[0]);
}
map.on('overlayadd', this.updateURL);
map.on('overlayremove', this.updateURL);
map.on('layeradd', this.updateURL);
map.on('layerremove', this.updateURL);
},
setHashMarker () {
const crosshairsMarkerIcon = L.divIcon({
iconSize: [20, 20],
iconAnchor: [10, 10],
@@ -643,7 +643,7 @@ export default {
</svg>
`
});
const updateMarker = (latlng) => {
if (this.hashMarker) {
if (latlng) {
@@ -657,7 +657,7 @@ export default {
this.hashMarker.addTo(map).getElement().style.fill = "fuchsia";
}
}
const parts = document.location.hash.substring(1).split(":")[0].split("/").map(p => decodeURIComponent(p));
if (parts.length == 3) {
setTimeout(() => map.setView(parts.slice(1).reverse(), parts[0]), 500);
@@ -677,7 +677,7 @@ export default {
mounted () {
map = L.map('map', {maxZoom: 22});
const eventsOptions = () => {
return {
start: false,
@@ -703,7 +703,7 @@ export default {
}
}
};
layers["Events (QC)"] = L.realtime(this.getEvents(i => i.properties.type == "qc"), eventsOptions());
layers["Events (Other)"] = L.realtime(this.getEvents(i => i.properties.type != "qc"), eventsOptions());
@@ -729,7 +729,7 @@ export default {
//console.log("Events (Other) remove event", e);
});
const init = this.decodeURL();
if (init.activeOverlays) {
@@ -828,7 +828,7 @@ export default {
});
(new LoadingControl({position: "bottomright"})).addTo(map);
// Decode a position if one is given in the hash
this.setHashMarker();
}
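
One detail worth noting in the layer refresh code above: each layer keeps its own AbortController, and any still-running request is aborted before a new one is issued, so a slow response can never overwrite fresher data. A minimal sketch of that pattern in isolation (names are illustrative, not taken from the component):

  // Illustrative helper: each call cancels the previous in-flight request.
  function makeRefresher () {
    let controller = null;
    return async function refresh (url) {
      if (controller) controller.abort();     // supersede the stale request
      controller = new AbortController();
      try {
        const res = await fetch(url, {
          signal: controller.signal,
          headers: { Accept: "application/geo+json" }
        });
        return await res.json();
      } catch (err) {
        if (err.name === "AbortError") return null;  // superseded, not a failure
        throw err;
      }
    };
  }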

View File

@@ -4,7 +4,7 @@
<v-card-title>
<v-toolbar flat>
<v-toolbar-title>Plan</v-toolbar-title>
<v-menu v-if="items">
<template v-slot:activator="{on, attrs}">
<v-btn class="ml-5" small v-on="on" v-bind="attrs">
@@ -12,7 +12,7 @@
<v-icon right small>mdi-cloud-download</v-icon>
</v-btn>
</template>
<v-list>
<v-list-item
:href="`/api/project/${$route.params.project}/plan/?mime=text%2Fcsv&download`"
@@ -36,7 +36,7 @@
>PDF</v-list-item>
</v-list>
</v-menu>
<v-spacer></v-spacer>
<v-text-field
v-model="filter"
@@ -48,7 +48,7 @@
</v-toolbar>
</v-card-title>
<v-card-text>
<v-menu v-if="writeaccess"
v-model="contextMenuShow"
:position-x="contextMenuX"
@@ -63,7 +63,7 @@
</v-list-item>
</v-list>
</v-menu>
<v-card class="mb-5" flat>
<v-card-title class="text-overline">
Comments
@@ -77,7 +77,7 @@
>
<v-icon small>mdi-square-edit-outline</v-icon>
</v-btn>
<v-btn v-else
class="ml-3"
small
@@ -89,7 +89,7 @@
</v-btn>
</template>
</v-card-title>
<v-card-text v-if="editRemarks">
<v-textarea
v-model="remarks"
@@ -100,9 +100,9 @@
rows="1"
></v-textarea>
</v-card-text>
<v-card-text v-else v-html="$options.filters.markdown(remarks || '*(nil)*')"></v-card-text>
</v-card>
<v-data-table
@@ -121,7 +121,7 @@
<template v-slot:item.srss="{item}">
<v-icon small :title="srssInfo(item)">{{srssIcon(item)}}</v-icon>
</template>
<template v-slot:item.sequence="{item, value}">
<v-edit-dialog v-if="writeaccess"
large
@@ -253,7 +253,7 @@
<template v-slot:item.azimuth="props">
<span style="white-space:nowrap;">{{ props.value.toFixed(2) }} °</span>
</template>
<template v-slot:item.remarks="{item}">
<v-text-field v-if="writeaccess && edit && edit.sequence == item.sequence && edit.key == 'remarks'"
type="text"
@@ -322,9 +322,9 @@
</v-edit-dialog>
<span v-else>{{ Math.round(lagAfter(item) / (60*1000)) }} min</span>
</template>
</v-data-table>
</v-card-text>
</v-card>
</v-container>
@@ -339,7 +339,7 @@ import { mapActions, mapGetters } from 'vuex';
export default {
name: "Plan",
components: {
},
@@ -421,7 +421,7 @@ export default {
plannerConfig: null,
shiftAll: false, // Shift all sequences checkbox
// Context menu stuff
contextMenuShow: false,
contextMenuX: 0,
@@ -433,17 +433,17 @@ export default {
computed: {
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
},
watch: {
async edit (newVal, oldVal) {
if (newVal === null && oldVal !== null) {
const item = this.items.find(i => i.sequence == oldVal.sequence);
// Get around this Vuetify feature
// https://github.com/vuetifyjs/vuetify/issues/4144
if (oldVal.value === null) oldVal.value = "";
if (item) {
if (item[oldVal.key] != oldVal.value) {
if (oldVal.key == "lagAfter") {
@@ -453,29 +453,29 @@ export default {
// Convert knots to metres per second
oldVal.value = oldVal.value*(1.852/3.6);
}
if (await this.saveItem(oldVal)) {
item[oldVal.key] = oldVal.value;
} else {
this.edit = oldVal;
}
}
}
}
},
async serverEvent (event) {
if (event.channel == "planned_lines" && event.payload.pid == this.$route.params.project) {
// Ignore no-ops (notifications where nothing actually changed)
/*
if (event.payload.old === null && event.payload.new === null) {
return;
}
*/
if (!this.loading && !this.queuedReload) {
// Do not force a non-cached response if refreshing as a result
// of an event notification. We will assume that the server has
@@ -491,34 +491,34 @@ export default {
}
}
},
queuedReload (newVal, oldVal) {
if (newVal && !oldVal && !this.loading) {
this.getPlannedLines();
}
},
loading (newVal, oldVal) {
if (!newVal && oldVal && this.queuedReload) {
this.getPlannedLines();
}
},
itemsPerPage (newVal, oldVal) {
localStorage.setItem(`dougal/prefs/${this.user?.name}/${this.$route.params.project}/${this.$options.name}/items-per-page`, newVal);
},
user (newVal, oldVal) {
this.itemsPerPage = Number(localStorage.getItem(`dougal/prefs/${this.user?.name}/${this.$route.params.project}/${this.$options.name}/items-per-page`)) || 25;
}
},
methods: {
suntimes (line) {
const oneday = 86400000;
function isDay (srss, ts, lat, lng) {
if (isNaN(srss.sunriseEnd) || isNaN(srss.sunsetStart)) {
// suncalc returns NaN when the sun neither rises nor sets here on this date (e.g. midnight sun between March and September at high northern latitudes)
@@ -541,31 +541,31 @@ export default {
}
}
}
let {ts0, ts1} = line;
const [ lng0, lat0 ] = line.geometry.coordinates[0];
const [ lng1, lat1 ] = line.geometry.coordinates[1];
if (ts1-ts0 > oneday) {
console.warn("Cannot provide reliable sunrise / sunset times for lines over 24 hr in this version");
//return null;
}
const srss0 = suncalc.getTimes(ts0, lat0, lng0);
const srss1 = suncalc.getTimes(ts1, lat1, lng1);
srss0.prevDay = suncalc.getTimes(new Date(ts0.valueOf()-oneday), lat0, lng0);
srss1.nextDay = suncalc.getTimes(new Date(ts1.valueOf()+oneday), lat1, lng1);
srss0.isDay = isDay(srss0, ts0, lat0, lng0);
srss1.isDay = isDay(srss1, ts1, lat1, lng1);
return {
ts0: srss0,
ts1: srss1
};
},
srssIcon (line) {
const srss = this.suntimes(line);
const moon = suncalc.getMoonIllumination(line.ts0);
@@ -585,7 +585,7 @@ export default {
: 'mdi-moon-waning-crescent'
: 'mdi-theme-light-dark';
},
srssMoonPhase (line) {
const ts = new Date((Number(line.ts0)+Number(line.ts1))/2);
const moon = suncalc.getMoonIllumination(ts);
@@ -601,11 +601,11 @@ export default {
? 'Waning gibbous moon'
: 'Waning crescent moon';
},
srssInfo (line) {
const srss = this.suntimes(line);
const text = [];
try {
text.push(`Sunset at\t${srss.ts0.prevDay.sunset.toISOString().substr(0, 16)}Z (FSP)`);
text.push(`Sunrise at\t${srss.ts0.sunrise.toISOString().substr(0, 16)}Z (FSP)`);
@@ -622,11 +622,11 @@ export default {
console.log("ERROR", err);
}
}
if (!srss.ts0.isDay || !srss.ts1.isDay) {
text.push(this.srssMoonPhase(line));
}
return text.join("\n");
},
@@ -647,7 +647,7 @@ export default {
const v = item.length / ((item.ts1-item.ts0)/1000); // m/s
return v*3.6/1.852;
},
contextMenu (e, {item}) {
e.preventDefault();
this.contextMenuShow = false;
@@ -656,7 +656,7 @@ export default {
this.contextMenuItem = item;
this.$nextTick( () => this.contextMenuShow = true );
},
async deletePlannedSequence () {
console.log("Delete sequence", this.contextMenuItem.sequence);
const url = `/project/${this.$route.params.project}/plan/${this.contextMenuItem.sequence}`;
@@ -664,7 +664,7 @@ export default {
await this.api([url, init]);
await this.getPlannedLines();
},
editItem (item, key, value) {
this.edit = {
sequence: item.sequence,
@@ -672,10 +672,10 @@ export default {
value: value === undefined ? item[key] : value
}
},
async saveItem (edit) {
if (!edit) return;
try {
const url = `/project/${this.$route.params.project}/plan/${edit.sequence}`;
const init = {
@@ -684,7 +684,7 @@ export default {
[edit.key]: edit.value
}
};
let res;
await this.api([url, init, (e, r) => res = r]);
return res && res.ok;
@@ -692,7 +692,7 @@ export default {
return false;
}
},
async saveRemarks () {
const url = `/project/${this.$route.params.project}/info/plan/remarks`;
let res;
@@ -735,12 +735,12 @@ export default {
"defaultLineChangeDuration": 36
}
},
async getPlannerRemarks () {
const url = `/project/${this.$route.params.project}/info/plan/remarks`;
this.remarks = await this.api([url]) || "";
},
async getSequences () {
const url = `/project/${this.$route.params.project}/sequence`;
this.sequences = await this.api([url]) || [];

View File

@@ -65,26 +65,36 @@
<v-treeview
:items="filteredItems"
:open.sync="open"
item-key="serial"
item-key="_serial"
item-text="_text"
item-children="_children"
:open-on-click="true"
>
<template v-slot:label="{item}">
<div @dblclick.stop.prevent="toggleChildren(item)">
{{item.name}}
<v-chip v-if="item.children && itemCount(item)"
small
<div @dblclick.stop.prevent="toggleChildren(item)" v-if="item._kind=='test'">
<b>{{item._text}}</b>
<v-chip class="ml-2" v-if="item._children && itemCount(item)"
x-small
color="warning"
v-text="itemCount(item)"
>
</v-chip>
<v-chip v-for="label of item.labels"
<v-chip class="ml-2" v-else
x-small
color="success"
>
All passed
</v-chip>
<v-chip v-for="label of item.labels" :key="label"
class="mx-1"
small
:color="labels[label] && labels[label].view.colour"
:title="labels[label] && labels[label].view.description"
:close="writeaccess && label == 'QCAccepted'"
@click:close="unaccept(item)">
@click:close="unaccept(item)"
>
{{label}}
</v-chip>
@@ -121,6 +131,20 @@
</v-btn>
</v-hover>
</template>
</div>
<div :title="item.remarks" @dblclick.stop.prevent="toggleChildren(item)" v-else-if="item._kind=='sequence'">
{{item._text}}
<v-chip class="ml-2" v-if="item._children && itemCount(item)"
x-small
color="primary"
v-text="itemCount(item)"
>
</v-chip>
</div>
<div class="text--secondary" v-else>
{{item._text}}
</div>
</template>
@@ -174,12 +198,10 @@ export default {
sequences () {
function getSeq (item) {
return "_id" in item
? Array.isArray(item._id)
? Number(item._id[0])
: Number(item._id)
: "children" in item
? item.children.map(i => getSeq(i)).flat()
return item?._kind == "sequence"
? item.sequence
: item?._children?.length
? item._children.map(i => getSeq(i)).flat()
: undefined;
}
@@ -192,8 +214,8 @@ export default {
const values = [];
function filterResults (item) {
if (item.children) {
for (const child of item.children) {
if (item._children) {
for (const child of item._children) {
filterResults(child);
}
} else if (item._id && item.id) {
@@ -253,17 +275,17 @@ export default {
itemCount (item, count = 0) {
let sum = count;
if (item.children) {
sum += item.children.map(child => this.itemCount(child)).reduce( (a, b) => a+b, 0 )
if (item._children) {
sum += item._children.map(child => this.itemCount(child)).reduce( (a, b) => a+b, 0 )
} else {
sum++;
}
return sum;
},
accepted (item) {
if (item.children) {
return item.children.every(child => this.accepted(child));
if (item._children) {
return item._children.every(child => this.accepted(child));
}
if (item.labels) {
@@ -273,8 +295,8 @@ export default {
},
accept (item) {
if (item.children) {
for (const child of item.children) {
if (item._children) {
for (const child of item._children) {
this.accept(child);
}
return;
@@ -288,8 +310,8 @@ export default {
},
unaccept (item) {
if (item.children) {
for (const child of item.children) {
if (item._children) {
for (const child of item._children) {
this.unaccept(child);
}
return;
@@ -319,7 +341,6 @@ export default {
for (const path of sequences) {
const url = `/project/${this.$route.params.project}/meta/raw/sequences/${path}`;
const promise = this.api([url]).then(res => {
console.log("Apply QC labels (seq)", res);
for (const item of res) {
const obj = this.resultObjects.find(o => o.sequence == item.sequence &&
o.point == item.point &&
@@ -337,7 +358,6 @@ export default {
for (const path of points) {
const url = `/project/${this.$route.params.project}/meta/raw/points/${path}`;
const promise = this.api([url]).then(res => {
console.log("Apply QC labels (point)", res);
for (const item of res) {
const obj = this.resultObjects.find(o => o.sequence == item.sequence &&
o.point == item.point &&
@@ -358,27 +378,25 @@ export default {
async saveLabels () {
const url = `/project/${this.$route.params.project}/meta`;
console.log("Saving", this.resultObjects.filter(r => typeof r.value !== "undefined"));
const res = await this.api([url, {
method: "PUT",
body: this.resultObjects.filter(r => typeof r.value !== "undefined")
}]);
console.log("RES", res);
this.isDirty = false;
},
filterByText(item, queryText) {
if (!queryText || !item) return item;
if (item.children) {
if (item._children) {
const newItem = Object.assign({}, item);
newItem.children = item.children.map( child => this.filterByText(child, queryText) ).filter(i => !!i)
if (newItem.children.length > 0) {
newItem._children = item._children.map( child => this.filterByText(child, queryText) ).filter(i => !!i)
if (newItem._children.length > 0) {
return newItem;
}
}
if (item.name && item.name.toLowerCase().indexOf(queryText.toLowerCase()) > -1) {
if (item._text && item._text.toLowerCase().indexOf(queryText.toLowerCase()) > -1) {
return item;
}
},
@@ -386,16 +404,17 @@ export default {
filterBySequence(item, sequences) {
if (!sequences || !sequences.length) return item;
if (item._id) {
if ( (item._id.length > 1 && sequences.includes(item._id[0])) || sequences.includes(item) ) {
return item;
}
if (item._kind == "sequence" && (sequences.includes(item.sequence) || sequences.includes(item))) {
return item;
}
if (item.children) {
const newItem = Object.assign({}, item);
newItem.children = item.children.map( child => this.filterBySequence(child, sequences) ).filter(i => !!i);
if (newItem.children.length > 0) {
if (item._children) {
const newItem = {...item};
newItem._children = item._children.map(child =>
this.filterBySequence(child, sequences)
).filter(i => !!i);
if (newItem._children.length) {
return newItem;
}
}
@@ -403,71 +422,60 @@ export default {
toggleChildren (item, state) {
const open = typeof state == 'undefined'
? !this.open.includes(item.serial)
? !this.open.includes(item._serial)
: state;
if (item.children) {
item.children.forEach(child => this.toggleChildren(child, open));
if (item._children) {
item._children.forEach(child => this.toggleChildren(child, open));
}
if (open) {
if (!this.open.includes(item.serial)) {
this.open.push(item.serial);
if (!this.open.includes(item._serial)) {
this.open.push(item._serial);
}
} else {
const index = this.open.indexOf(item.serial);
const index = this.open.indexOf(item._serial);
if (index > -1) {
this.open.splice(index, 1);
}
}
},
transform (item, testId) {
item.serial = ++this.itemIndex;
if (item.id) {
testId = item.id;
} else {
item.id = testId;
}
if (item.check) {
switch (item.iterate) {
case "sequences":
item.check = item.check.map(check => ({
_id: check._id,
name: `Sequence ${check._id}: ${check.results}`
}));
break;
case "shots":
default:
const bySequence = {};
for (const check of item.check) {
if (!bySequence[check._id[0]]) {
bySequence[check._id[0]] = [];
}
bySequence[check._id[0]].push({
_id: check._id,
name: `Point ${check._id[1]}: ${check.results}`
});
}
item.check = Object.keys(bySequence).map(seq => ({
_id: seq,
name: `Sequence: ${seq}`,
children: bySequence[seq]
}));
}
if (!("children" in item)) {
item.children = item.check;
delete item.check;
}
}
if (item.children) {
for (const child of item.children) {
this.transform(child, testId);
}
if (item.check) {
item.children = item.check.concat(item.children);
}
transform (item, qcId) {
item._serial = ++this.itemIndex;
if (item.name && (item.check || item.children)) {
// This is probably a test
qcId ??= item.id;
item._kind = "test";
item._text = item.name;
item._children = [];
if (item.children) {
// Child tests
item._children = item.children.map(i => this.transform(i, qcId));
}
if (item.sequences) {
// In theory an item could have both subtests and its own results,
// so we use two independent ifs here rather than an if … else.
item._children = item._children.concat(item.sequences.map(i => this.transform(i, qcId)));
}
} else if (item.sequence && item.line) {
// This is probably a sequence
item._kind = "sequence";
item._text = `Sequence ${item.sequence}${item.meta?.qc && item.meta.qc[qcId] ? (": "+item.meta.qc[qcId]) : ""}`;
if (item.shots && item.shots.length) {
item._children = item.shots.map(i => this.transform(i, qcId));
}
} else if (item.sequence && item.point) {
// This is probably a shotpoint
item._kind = "point";
item._text = `Point ${item.point}: ${item.remarks}`;
}
return item;
},
@@ -482,12 +490,12 @@ export default {
async getQCData () {
const url = `/project/${this.$route.params.project}/info/qc`;
const url = `/project/${this.$route.params.project}/qc/results`;
const res = await this.api([url]);
if (res) {
this.items = res.results.map(i => this.transform(i)) || [];
this.items = res.map(i => this.transform(i)) || [];
this.updatedOn = res.updatedOn;
await this.getQCLabels();
} else {

View File

@@ -1,6 +1,7 @@
module.exports = {
"transpileDependencies": [
"vuetify"
"vuetify",
"leaflet-arrowheads"
],
devServer: {
proxy: {

View File

@@ -6,8 +6,9 @@ const cookieParser = require('cookie-parser')
const maybeSendAlert = require("../lib/alerts");
const mw = require('./middleware');
const app = express();
const verbose = process.env.NODE_ENV != 'test';
const app = express();
app.locals.version = "0.3.0"; // API version
app.map = function(a, route){
route = route || '';
@@ -90,6 +91,11 @@ app.map({
'/project/:project/summary': {
get: [ mw.project.get ],
},
/*
* GIS endpoints
*/
'/project/:project/gis': {
get: [ mw.gis.project.bbox ]
},
@@ -105,6 +111,11 @@ app.map({
'/project/:project/gis/final/:featuretype(line|point)': {
get: [ mw.gis.project.final ]
},
/*
* Line endpoints
*/
'/project/:project/line/': {
get: [ mw.line.list ],
},
@@ -113,14 +124,25 @@ app.map({
patch: [ mw.auth.access.write, mw.line.patch ],
},
/*
* Sequence endpoints
*/
'/project/:project/sequence/': {
get: [ mw.sequence.list ],
},
'/project/:project/sequence/:sequence': {
get: [ mw.sequence.get ],
patch: [ mw.auth.access.write, mw.sequence.patch ],
'/:point': {
get: [ mw.sequence.point.get ]
}
},
/*
* Planner endpoints
*/
'/project/:project/plan/': {
get: [ mw.plan.list ],
put: [ mw.auth.access.write, mw.plan.put ],
@@ -131,23 +153,54 @@ app.map({
patch: [ mw.auth.access.write, mw.plan.patch ],
delete: [ mw.auth.access.write, mw.plan.delete ]
},
//
/*
* Event log endpoints
*/
'/project/:project/event/': {
get: [ mw.event.cache.get, mw.event.list, mw.event.cache.save ],
get: [ mw.event.list ],
post: [ mw.auth.access.write, mw.event.post ],
put: [ mw.auth.access.write, mw.event.put ],
delete: [ mw.auth.access.write, mw.event.delete ],
// TODO Rename -/:sequence → sequence/:sequence
'-/:sequence/': { // NOTE: We need to avoid conflict with the next endpoint ☹
get: [ mw.event.get ],
get: [ mw.event.sequence.get ],
},
':type/': {
':id/': {
// get: [ mw.event.get ],
put: [ mw.auth.access.write, mw.event.put ],
delete: [mw.auth.access.write, mw.event.delete ]
}
':id/': {
get: [ mw.event.get ],
put: [ mw.auth.access.write, mw.event.put ],
patch: [ mw.auth.access.write, mw.event.patch ],
delete: [mw.auth.access.write, mw.event.delete ]
},
},
/*
* QC endpoints
*/
'/project/:project/qc': {
'/results': {
// Get all QC results for :project
get: [ mw.qc.results.get ],
// Delete all QC results for :project
delete: [ mw.auth.access.write, mw.qc.results.delete ],
'/sequence/:sequence': {
// Get QC results for :project, :sequence
get: [ mw.qc.results.get ],
// Delete QC results for :project, :sequence
delete: [ mw.auth.access.write, mw.qc.results.delete ]
}
}
},
/*
* Other miscellaneous endpoints
*/
'/project/:project/label/': {
get: [ mw.label.list ],
// post: [ mw.label.post ],

View File

@@ -1,83 +0,0 @@
const { listen } = require('../../../ws/db');
// Event responses take a long time as we are querying a view
// which is the union of other views and non-optimised tables,
// so to speed things up a bit for the user we cache the
// results here.
// We do this by indexing each result by its ETag value and
// storing the ID of the project it belongs to as well as the
// timestamp of the request. If the events for a project are
// modified in any way (addition/deletion/change) we immediately
// invalidate all cached responses for that project; otherwise we
// evict them once they are older than maxAge (the cleanup sweep
// runs every five minutes, so eviction may lag by up to that much).
// When the user sends a request with an ETag, we search for
// the ETag in our cache and return that, if present, instead
// of hitting the database.
const cache = {};
const maxAge = 90*60*1000; // 1.5 hours
setInterval(() => {
const now = Date.now();
for (const key in cache) {
const value = cache[key];
if ((now - value.tstamp) > maxAge) {
// console.log("CLEARING", key);
delete cache[key];
}
}
}, 5*60*1000); // Run every five minutes
listen(["event"], (data) => {
for (const key in cache) {
const value = cache[key];
if (value.pid == data.payload.pid) {
delete cache[key];
}
}
});
function get (req, res, next) {
try {
// console.log(cache);
const etag = req.get('if-none-match');
// console.log("ETag", etag);
if (etag && cache[etag]) {
// console.log("In cache");
if (cache[etag].headers) {
for (const header in cache[etag].headers) {
const value = cache[etag].headers[header];
if (header && value) {
res.set(header, value);
}
}
}
// 304s have no body
// https://tools.ietf.org/html/rfc7232#section-4.1
res.status(304).send();
next('route');
} else {
// console.log("Not in cache");
next();
}
} catch (err) {
next(err);
}
}
function save (req, res, next) {
const etag = res.getHeader("etag");
if (etag) {
cache[etag] = {
headers: {
"Content-Type": res.getHeader("content-type") || "application/json"
},
pid: req.params.project,
tstamp: Date.now()
}
// console.log("CACHE", cache);
}
next();
}
module.exports = { get, save };
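
From the client's point of view the deleted cache above was plain HTTP conditional requests: present the last ETag, get a 304 with no body if nothing changed. A minimal sketch of a caller exercising that contract (the `/api` prefix is an assumption):

  // Remember the ETag from the first response and replay it on later calls.
  let etag = null, cached = null;
  async function getEvents (project) {
    const headers = { Accept: "application/json" };
    if (etag) headers["If-None-Match"] = etag;
    const res = await fetch(`/api/project/${project}/event/`, { headers });
    if (res.status === 304) return cached;      // unchanged; reuse the stored body
    etag = res.headers.get("ETag");
    cached = await res.json();
    return cached;
  }

The route map earlier in this diff drops the cache from the chain (`get: [ mw.event.list ]`), which is why the middleware itself is removed here.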

View File

@@ -4,26 +4,7 @@ const { event } = require('../../../lib/db');
module.exports = async function (req, res, next) {
try {
const payload = Object.assign({}, req.body);
if (req.params.type && req.params.id) {
payload.type = req.params.type;
payload.id = req.params.id;
}
if (req.params.labels) {
payload.labels = req.params.labels.split(";");
}
if (!req.meta.isLabel) {
// User is requesting that we delete the whole event,
// not just labels
// FIXME NOTE Removal of labels would be best done via
// a PUT request.
delete payload.labels
}
await event.del(req.params.project, payload, req.query);
await event.del(req.params.project, req.params.id);
res.status(204).send();
next();
} catch (err) {

View File

@@ -1,29 +1,14 @@
const json = require('./json');
const geojson = require('./geojson');
const seis = require('./seis');
const html = require('./html');
const pdf = require('./pdf');
module.exports = async function (req, res, next) {
const { event } = require('../../../../lib/db');
const json = async function (req, res, next) {
try {
const handlers = {
"application/json": json,
"application/geo+json": geojson,
"application/vnd.seis+json": seis,
"text/html": html,
"application/pdf": pdf
};
const mimetype = (handlers[req.query.mime] && req.query.mime) || req.accepts(Object.keys(handlers));
if (mimetype) {
res.set("Content-Type", mimetype);
await handlers[mimetype](req, res, next);
} else {
res.status(406).send();
next();
}
const response = await event.get(req.params.project, req.params.id);
res.status(200).send(response);
next();
} catch (err) {
next(err);
}
}
};
module.exports = json;

View File

@@ -1,9 +1,10 @@
module.exports = {
list: require('./list'),
sequence: require('./sequence'),
get: require('./get'),
post: require('./post'),
put: require('./put'),
delete: require('./delete'),
cache: require('./cache')
patch: require('./patch'),
delete: require('./delete')
}

View File

@@ -4,13 +4,13 @@ const { event } = require('../../../../lib/db');
const geojson = async function (req, res, next) {
try {
const events = await event.list(req.params.project, req.query);
const response = events.filter(event => event.geometry).map(event => {
const response = events.filter(event => event.meta.geometry).map(event => {
const feature = {
type: "Feature",
geometry: event.geometry,
geometry: event.meta.geometry,
properties: event
};
delete feature.properties.geometry;
delete feature.properties.meta.geometry;
return feature;
});
res.status(200).send(response);
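
The serialiser above now takes the geometry from `event.meta.geometry` and deletes the nested copy so it is not duplicated under `properties`. For a single event the mapping looks like this (sample values invented):

  // Input event:
  const event = {
    sequence: 1042,
    remarks: "Gun autofire",
    meta: { geometry: { type: "Point", coordinates: [3.2, 60.1] } }
  };
  // Resulting GeoJSON feature:
  // {
  //   type: "Feature",
  //   geometry: { type: "Point", coordinates: [3.2, 60.1] },
  //   properties: { sequence: 1042, remarks: "Gun autofire", meta: {} }
  // }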

View File

@@ -9,9 +9,9 @@ module.exports = async function (req, res, next) {
"application/geo+json": geojson,
"application/vnd.seis+json": seis
};
const mimetype = req.accepts(Object.keys(handlers));
const mimetype = (handlers[req.query.mime] && req.query.mime) || req.accepts(Object.keys(handlers));
if (mimetype) {
res.set("Content-Type", mimetype);
await handlers[mimetype](req, res, next);

View File

@@ -0,0 +1,16 @@
const { event } = require('../../../lib/db');
module.exports = async function (req, res, next) {
try {
const payload = req.body;
await event.patch(req.params.project, req.params.id, payload, req.query);
res.status(201).send();
next();
} catch (err) {
next(err);
}
};
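
The new PATCH handler forwards the request body to `event.patch` for a partial update of event `:id`. A hedged client-side sketch (identifiers and payload shape are invented for illustration; the `/api` prefix is an assumption):

  const project = "p1", id = 123;   // illustrative values
  await fetch(`/api/project/${project}/event/${id}/`, {
    method: "PATCH",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ remarks: "Updated remark" })  // assumed payload shape
  });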

View File

@@ -6,27 +6,6 @@ module.exports = async function (req, res, next) {
try {
const payload = req.body;
if (req.params.type) {
payload.type = req.params.type;
}
if (payload.type == "timed") {
if (!payload.tstamp) {
payload.tstamp = (new Date).toISOString();
}
delete payload.sequence;
delete payload.point;
} else if (payload.type == "sequence") {
delete payload.tstamp;
}
if (req.params.tstamp) {
payload.tstamp = req.params.tstamp;
} else if (req.params.sequence && req.params.shot) {
payload.sequence = req.params.sequence;
payload.point = req.params.shot;
}
await event.post(req.params.project, payload, req.query);
res.status(201).send();
next();

View File

@@ -6,28 +6,7 @@ module.exports = async function (req, res, next) {
try {
const payload = req.body;
if (req.params.type) {
payload.type = req.params.type;
}
if (payload.type == "timed") {
if (!payload.tstamp) {
payload.tstamp = (new Date).toISOString();
}
delete payload.sequence;
delete payload.point;
} else if (payload.type == "sequence") {
delete payload.tstamp;
}
if (req.params.tstamp) {
payload.tstamp = req.params.tstamp;
} else if (req.params.sequence && req.params.shot) {
payload.sequence = req.params.sequence;
payload.point = req.params.shot;
}
await event.put(req.params.project, payload, req.query);
await event.put(req.params.project, req.params.id, payload, req.query);
res.status(201).send();
next();
} catch (err) {

View File

@@ -1,4 +1,4 @@
const { transform, prepare } = require('../../../../lib/sse');
const { transform, prepare } = require('../../../../../lib/sse');
const geojson = async function (req, res, next) {
try {

View File

@@ -1,9 +1,9 @@
const { configuration } = require('../../../../lib/db');
const { transform, prepare } = require('../../../../lib/sse');
const render = require('../../../../lib/render');
const { configuration } = require('../../../../../lib/db');
const { transform, prepare } = require('../../../../../lib/sse');
const render = require('../../../../../lib/render');
// FIXME Refactor when able
const defaultTemplatePath = require('path').resolve(__dirname, "../../../../../../../etc/default/templates/sequence.html.njk");
const defaultTemplatePath = require('path').resolve(__dirname, "../../../../../../../../etc/default/templates/sequence.html.njk");
const html = async function (req, res, next) {
try {
@@ -13,9 +13,9 @@ const html = async function (req, res, next) {
const seis = transform(events, sequences, {projectId: req.params.project, missingAsEvent: true});
const template = (await configuration.get(req.params.project, "sse/templates/0/template")) || defaultTemplatePath;
// console.log("TEMPLATE", template);
const response = await render(seis, template);
if ("download" in query || "d" in query) {
const extension = "html";
// Get the sequence number(s) (more than one sequence can be selected)

View File

@@ -0,0 +1,29 @@
const json = require('./json');
const geojson = require('./geojson');
const seis = require('./seis');
const html = require('./html');
const pdf = require('./pdf');
module.exports = async function (req, res, next) {
try {
const handlers = {
"application/json": json,
"application/geo+json": geojson,
"application/vnd.seis+json": seis,
"text/html": html,
"application/pdf": pdf
};
const mimetype = (handlers[req.query.mime] && req.query.mime) || req.accepts(Object.keys(handlers));
if (mimetype) {
res.set("Content-Type", mimetype);
await handlers[mimetype](req, res, next);
} else {
res.status(406).send();
next();
}
} catch (err) {
next(err);
}
}
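
This dispatcher pattern now recurs across the event, sequence and plan endpoints: an explicit `?mime=` query parameter wins, otherwise `req.accepts()` negotiates against the handler table, and anything unsupported gets a 406. Two equivalent ways for a client to ask for GeoJSON (the `/api` prefix is an assumption; the query form matches the `mime=text%2Fcsv` download links used elsewhere in this diff):

  const project = "p1";   // illustrative project id
  // 1. Content negotiation via the Accept header:
  await fetch(`/api/project/${project}/event/`, {
    headers: { Accept: "application/geo+json" }
  });
  // 2. Explicit override via the mime query parameter (handy for plain links):
  await fetch(`/api/project/${project}/event/?mime=${encodeURIComponent("application/geo+json")}`);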

View File

@@ -1,4 +1,4 @@
const { transform, prepare } = require('../../../../lib/sse');
const { transform, prepare } = require('../../../../../lib/sse');
const json = async function (req, res, next) {
try {

View File

@@ -1,13 +1,13 @@
const fs = require('fs/promises');
const Path = require('path');
const crypto = require('crypto');
const { configuration } = require('../../../../lib/db');
const { transform, prepare } = require('../../../../lib/sse');
const render = require('../../../../lib/render');
const { url2pdf } = require('../../../../lib/selenium');
const { configuration } = require('../../../../../lib/db');
const { transform, prepare } = require('../../../../../lib/sse');
const render = require('../../../../../lib/render');
const { url2pdf } = require('../../../../../lib/selenium');
// FIXME Refactor when able
const defaultTemplatePath = require('path').resolve(__dirname, "../../../../../../../etc/default/templates/sequence.html.njk");
const defaultTemplatePath = require('path').resolve(__dirname, "../../../../../../../../etc/default/templates/sequence.html.njk");
function tmpname (tmpdir="/dev/shm") {
return Path.join(tmpdir, crypto.randomBytes(16).toString('hex')+".tmp");
@@ -21,12 +21,12 @@ const pdf = async function (req, res, next) {
const {events, sequences} = await prepare(req.params.project, query);
const seis = transform(events, sequences, {projectId: req.params.project, missingAsEvent: true});
const template = (await configuration.get(req.params.project, "sse/templates/0/template")) || defaultTemplatePath;
const html = await render(seis, template);
await fs.writeFile(fname, html);
const pdf = Buffer.from(await url2pdf("file://"+fname), "base64");
if ("download" in query || "d" in query) {
const extension = "pdf";
// Get the sequence number(s) (more than one sequence can be selected)

View File

@@ -1,4 +1,4 @@
const { transform, prepare } = require('../../../../lib/sse');
const { transform, prepare } = require('../../../../../lib/sse');
const seis = async function (req, res, next) {
try {

View File

@@ -0,0 +1,9 @@
module.exports = {
// list: require('./list'),
get: require('./get'),
// post: require('./post'),
// put: require('./put'),
// delete: require('./delete'),
// cache: require('./cache')
}

View File

@@ -10,6 +10,7 @@ module.exports = {
label: require('./label'),
navdata: require('./navdata'),
queue: require('./queue'),
qc: require('./qc'),
configuration: require('./configuration'),
info: require('./info'),
meta: require('./meta'),

View File

@@ -4,7 +4,7 @@ const { info } = require('../../../lib/db');
module.exports = async function (req, res, next) {
try {
await info.delete(req.params.project, req.params.path);
await info.delete(req.params.project, req.params.path, undefined, req.user.role);
res.status(204).send();
next();
} catch (err) {

View File

@@ -4,7 +4,7 @@ const { info } = require('../../../lib/db');
module.exports = async function (req, res, next) {
try {
res.status(200).json(await info.get(req.params.project, req.params.path, req.query));
res.status(200).json(await info.get(req.params.project, req.params.path, req.query, req.user.role));
} catch (err) {
if (err instanceof TypeError) {
res.status(404).json(null);

View File

@@ -6,7 +6,7 @@ module.exports = async function (req, res, next) {
try {
const payload = req.body;
await info.post(req.params.project, req.params.path, payload);
await info.post(req.params.project, req.params.path, payload, undefined, req.user.role);
res.status(201).send();
next();
} catch (err) {

View File

@@ -6,7 +6,7 @@ module.exports = async function (req, res, next) {
try {
const payload = req.body;
await info.put(req.params.project, req.params.path, payload);
await info.put(req.params.project, req.params.path, payload, undefined, req.user.role);
res.status(201).send();
next();
} catch (err) {

View File

@@ -4,13 +4,13 @@ const { plan } = require('../../../../lib/db');
const json = async function (req, res, next) {
try {
const response = await plan.list(req.params.project, req.query);
if ("download" in req.query || "d" in req.query) {
const extension = "html";
const filename = `${req.params.project.toUpperCase()}-Plan.${extension}`;
res.set("Content-Disposition", `attachment; filename="${filename}"`);
}
const transforms = (i) => {
i.lon0 = Number(((i?.geometry?.coordinates||[])[0]||[])[0]).toFixed(6)*1;
i.lat0 = Number(((i?.geometry?.coordinates||[])[0]||[])[1]).toFixed(6)*1;
@@ -22,14 +22,14 @@ const json = async function (req, res, next) {
delete i.meta;
return i;
};
const csv = new AsyncParser({transforms}, {objectMode: true});
csv.processor.on('error', (err) => { throw err; });
csv.processor.on('end', () => {
res.end();
next();
});
res.status(200);
csv.processor.pipe(res);
response.forEach(row => csv.input.push(row));

View File

@@ -20,10 +20,10 @@ const html = async function (req, res, next) {
delete feature.properties.geometry;
return feature;
});
// const template = (await configuration.get(req.params.project, "sse/templates/0/template")) || defaultTemplatePath;
const template = defaultTemplatePath;
const mapConfig = {
size: { width: 500, height: 500 },
layers: [
@@ -52,18 +52,18 @@ const html = async function (req, res, next) {
}
]
}
const map = leafletMap(mapConfig);
const data = {
projectId: req.params.project,
info: planInfo,
lines,
map: await map.getImageData()
}
const response = await render(data, template);
if ("download" in req.query || "d" in req.query) {
const extension = "html";
const filename = `${req.params.project.toUpperCase()}-Plan.${extension}`;

View File

@@ -13,9 +13,9 @@ module.exports = async function (req, res, next) {
"text/html": html,
"application/pdf": pdf
};
const mimetype = (handlers[req.query.mime] && req.query.mime) || req.accepts(Object.keys(handlers));
if (mimetype) {
res.set("Content-Type", mimetype);
await handlers[mimetype](req, res, next);

View File

@@ -31,8 +31,8 @@ const pdf = async function (req, res, next) {
});
// const template = (await configuration.get(req.params.project, "sse/templates/0/template")) || defaultTemplatePath;
const template = defaultTemplatePath;
const mapConfig = {
size: { width: 500, height: 500 },
layers: [
@@ -61,21 +61,21 @@ const pdf = async function (req, res, next) {
}
]
}
const map = leafletMap(mapConfig);
const data = {
projectId: req.params.project,
info: planInfo,
lines,
map: await map.getImageData()
}
const html = await render(data, template);
await fs.writeFile(fname, html);
const pdf = Buffer.from(await url2pdf("file://"+fname), "base64");
if ("download" in req.query || "d" in req.query) {
const extension = "pdf";
const filename = `${req.params.project.toUpperCase()}-Plan.${extension}`;

View File

@@ -0,0 +1,4 @@
module.exports = {
results: require('./results')
};

View File

@@ -0,0 +1,16 @@
const { qc } = require('../../../../lib/db');
module.exports = async function (req, res, next) {
try {
const payload = req.body;
await qc.results.delete(req.params.project, req.params.sequence);
res.status(204).send();
next();
} catch (err) {
next(err);
}
};

View File

@@ -0,0 +1,14 @@
const { qc } = require('../../../../lib/db');
module.exports = async function (req, res, next) {
try {
res.status(200).json(await qc.results.get(req.params.project, req.params.sequence, req.query, req.user.role));
} catch (err) {
next(err);
return;
}
next();
};

View File

@@ -0,0 +1,4 @@
module.exports = {
get: require('./get'),
delete: require('./delete')
};
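
These handlers pair with the `/project/:project/qc/results` routes registered earlier in this diff; both GET and DELETE optionally take a `/sequence/:sequence` suffix to scope the operation. A usage sketch (again assuming an `/api` prefix; DELETE requires write access):

  const project = "p1";   // illustrative project id
  const base = `/api/project/${project}/qc/results`;
  // All QC results for the project:
  const all = await (await fetch(base)).json();
  // Results for a single sequence:
  const one = await (await fetch(`${base}/sequence/1042`)).json();
  // Drop the stored results for that sequence:
  await fetch(`${base}/sequence/1042`, { method: "DELETE" });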

View File

@@ -6,7 +6,7 @@ module.exports = async function (req, res, next) {
if (req.query.remote) {
// We're being asked to fetch a remote feed
// NOTE: No, we don't limit what feeds the user can fetch
const r = await fetch(req.query.remote);
if (r && r.ok) {
res.set("Content-Type", "application/xml");

View File

@@ -6,7 +6,7 @@ module.exports = async function (req, res, next) {
try {
const json = await sequence.get(req.params.project, req.params.sequence, req.query);
const geometry = req.query.geometry || "geometrypreplot";
const geojson = {
type: "FeatureCollection",
features: json.map(feature => {
@@ -17,7 +17,7 @@ module.exports = async function (req, res, next) {
}
})
};
res.status(200).send(geojson);
next();
} catch (err) {

View File

@@ -7,9 +7,9 @@ module.exports = async function (req, res, next) {
"application/json": json,
"application/geo+json": geojson,
};
const mimetype = (handlers[req.query.mime] && req.query.mime) || req.accepts(Object.keys(handlers));
if (mimetype) {
res.set("Content-Type", mimetype);
await handlers[mimetype](req, res, next);

View File

@@ -1,5 +1,7 @@
module.exports = {
list: require('./list'),
get: require('./get'),
patch: require('./patch')
patch: require('./patch'),
point: require('./point')
};

Some files were not shown because too many files have changed in this diff