Compare commits


85 Commits

Author SHA1 Message Date
D. Berge
ea3e31058f Refactor the planned lines editing logic.
We move most of the logic from the client (where it has lived until
now) to the server.

The PATCH command maintains the same format but it should provide only
one of the following keys per request:

* ts0
* ts1
* speed
* fsp
* lsp
* lagAfter
* sequence

Earlier keys in the list above take priority over later ones.

The following keys may be provided by themselves or in combination with
each other (but not with any of the above):

* name
* remarks
* meta

As a special case, an empty string as the `name` value causes the name
to be auto-generated.

See the comments in `patch.js` for details on the update logic.
2021-05-28 20:30:59 +02:00
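The key rules above can be sketched as a small validator. This is a hypothetical helper for illustration only (the key names come from the commit message; the function name is an assumption, and the actual logic lives server-side in `patch.js`):

```python
# Hypothetical sketch of the PATCH key rules described above.
# Key names come from the commit message; the function name is an assumption.

EXCLUSIVE_KEYS = ["ts0", "ts1", "speed", "fsp", "lsp", "lagAfter", "sequence"]
COMBINABLE_KEYS = {"name", "remarks", "meta"}

def select_patch_key(body):
    """Return ("exclusive", key) for the highest-priority exclusive key,
    or ("combinable", keys) when only combinable keys are present."""
    exclusive = [k for k in EXCLUSIVE_KEYS if k in body]
    combinable = set(body) & COMBINABLE_KEYS
    if exclusive and combinable:
        raise ValueError("exclusive keys cannot be combined with name/remarks/meta")
    if exclusive:
        # earlier keys in EXCLUSIVE_KEYS take priority over later ones
        return ("exclusive", exclusive[0])
    if combinable:
        return ("combinable", combinable)
    raise ValueError("no recognised keys in PATCH body")
```

An empty string as the `name` value in the combinable set would then trigger the auto-generation path.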
D. Berge
534a54ef75 Add database upgrade file 05 2021-05-28 20:30:59 +02:00
D. Berge
f314536daf Change planned_lines trigger from statement to row.
Because a) it tells us what has changed, and b) it doesn't fire if we
didn't actually change anything.
2021-05-28 20:30:59 +02:00
D. Berge
de4aa52417 Make planned_lines primary key deferrable.
Helps when we need to renumber sequences.
2021-05-28 20:30:59 +02:00
D. Berge
758b13b189 Add saillines layer to map 2021-05-28 20:30:29 +02:00
D. Berge
967db1dec6 Include NTBA status in preplot GIS output 2021-05-28 20:29:57 +02:00
D. Berge
91fd5e4559 Ensure that timestamp is always a Date object 2021-05-27 17:50:01 +02:00
D. Berge
cf171628cd Fix error in editing of planned line start time 2021-05-27 17:49:32 +02:00
D. Berge
94c29f4723 Change the sunset / sunrise times reported via the tooltip.
The icon still uses the lower edge of the sun to calculate day / night,
but the tooltip shows actual sunrise and sunset times.
2021-05-27 02:08:30 +02:00
D. Berge
14b2e55a2e Remove edit controls from planner for read-only users.
Left over from #108.
2021-05-27 01:32:03 +02:00
D. Berge
c30e54a515 Round vessel speeds to 0.1 kt 2021-05-27 01:09:28 +02:00
D. Berge
7ead826677 Show sunrise / sunset times in the planner.
* A ‘sun’ icon is shown when a line starts and ends in daytime
* A ‘moon’ icon is shown when a line starts and ends in nighttime
* A ‘sun/moon’ icon is shown in other cases

Sunrise and sunset times are provided as a tooltip when hovering over
the icon.

Closes #72.
2021-05-27 01:02:42 +02:00
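The icon choice reduces to a two-flag decision; a minimal sketch (function and argument names are assumptions):

```python
def line_icon(start_is_day, end_is_day):
    """Pick the planner icon from day/night flags at line start and end."""
    if start_is_day and end_is_day:
        return "sun"
    if not start_is_day and not end_is_day:
        return "moon"
    # line crosses sunrise or sunset
    return "sun/moon"
```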
D. Berge
7aecb514db Clear QC metadata when importing gun data.
Fixes #118.
2021-05-26 00:30:58 +02:00
D. Berge
ad395aa6e4 Include the planned lines table in system dumps 2021-05-26 00:15:09 +02:00
D. Berge
523ec937dd Always merge metadata on import.
The INSERT INTO raw_lines / final_lines will not always be executed,
as the lines may already exist (particularly in raw_lines, because of
*online*), so whether or not the insert ran we merge the metadata
immediately afterwards (this may cause an extra notification to be
fired).
2021-05-25 03:19:42 +02:00
D. Berge
9d2ccd75dd Do not try to use line name if there isn't one 2021-05-25 03:19:00 +02:00
D. Berge
3985a6226b Suggest ${lineName}-NavLog.${extension} as file name.
This is for the usual case where only one sequence is requested.

When more than one sequence is requested, the suggested name comes out
as ${projectId}-${sequenceList}.${extension}, where `sequenceList` is
the list of sequence numbers separated by semicolons, e.g.:
eq21203-37;38;39.html.

Closes #116.
2021-05-25 02:23:41 +02:00
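The naming rule above can be sketched as follows (a hypothetical helper; the argument names are assumptions, the formats come from the commit message):

```python
def suggested_filename(sequences, extension, line_name=None, project_id=None):
    """Suggest a download file name following the rules described above."""
    if len(sequences) == 1 and line_name:
        # usual case: a single requested sequence
        return f"{line_name}-NavLog.{extension}"
    # multiple sequences: semicolon-separated list of sequence numbers
    sequence_list = ";".join(str(s) for s in sequences)
    return f"{project_id}-{sequence_list}.{extension}"
```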
D. Berge
7d354ffdb6 Add database upgrade file 2021-05-25 02:21:11 +02:00
D. Berge
3d70a460ac Output raw and final lines metadata in summary views 2021-05-25 02:13:50 +02:00
D. Berge
caae656aae Fix event detection failure.
There was a typo in the channel detection logic, resulting
in bogus events full of `undefined` data values.

Fixes #115.
2021-05-24 18:30:53 +02:00
D. Berge
5708ed1a11 Merge branch '57-make-event-log-entries-for-start-and-end-of-line-upon-import-of-final-sequence-if-the-entries-do' into 'devel'
Resolve "Make event log entries for start and end of line upon import of final sequence, if the entries do not already exist"

Closes #57

See merge request wgp/dougal/software!11
2021-05-24 15:44:58 +00:00
D. Berge
ad3998d4c6 Add database upgrade file 2021-05-24 17:41:11 +02:00
D. Berge
8638f42e6d Add database upgrade files.
These files contain the sequence of SQL commands needed to bring
a database or project schema up to date with the latest template
database or project schema.

These files must be applied manually. Check the comments at the top of
the file for instructions.
2021-05-24 17:39:01 +02:00
D. Berge
bc5aef5144 Run post-import functions after final lines.
The reason we need to do it like this instead of relying on a trigger
is that the entry in final_lines is created first and the final_shots
are populated afterwards. If we fire the trigger on final_lines it is
not going to find any shots; if we fire it as a row trigger on
final_shots it would try to label every point in the sequence as it is
imported; finally, if we fire it as a statement trigger on final_shots
we have no idea which sequence was imported.
2021-05-24 16:59:56 +02:00
D. Berge
2b798c3ea3 Ignore attempts to put the same label twice on the same event 2021-05-24 16:59:20 +02:00
D. Berge
4d97784829 Upgrade database project schema template.
Adds:

* label_in_sequence (_sequence integer, _label text):
  Returns events containing the specified label.

* handle_final_line_events (_seq integer, _label text, _column text):
  - If _label does not exist in the events for sequence _seq:
    it adds a new _label label at the shotpoint obtained from
    final_lines_summary[_column].
  - If _label does exist (and hasn't been auto-added by this function
    in a previous run), it will add information about it to the final
    line's metadata.

* final_line_post_import (_seq integer):
  Calls handle_final_line_events() on the given sequence to check
  for FSP, FGSP, LGSP and LSP labels.

* events_seq_labels_single ():
  Trigger function to ensure that labels with the attribute
  `model.multiple` set to `false` occur at most once per
  sequence. If a new instance is added to a sequence, the previous
  instance is deleted.

* Trigger on events_seq_labels that calls events_seq_labels_single().

* Trigger on events_timed_labels that calls events_seq_labels_single().
2021-05-24 16:49:39 +02:00
D. Berge
13da38b4cd Make websocket notifications await.
Not sure if this helps much. It might help with avoiding
out of order notifications and reducing the rate at which
the clients get spammed when importing database dumps and
such, but that hasn't been tested.
2021-05-24 15:52:29 +02:00
D. Berge
5af89050fb Refactor SOL/EOL real-time detection handler.
This also implements a generic handler mechanism that can be
reused for other purposes, such as sending email / XMPP notifications,
doing real-time QC checks and so on.

Fixes #113.
2021-05-24 13:48:53 +02:00
D. Berge
d40ceb8343 Refactor list of notification channels into its own file 2021-05-24 13:38:19 +02:00
D. Berge
56d1279584 Allow api action to make arbitrary HTTP(S) requests.
If the URL is an absolute HTTP(S) one, we use it as-is.
2021-05-24 13:35:36 +02:00
D. Berge
d02edb4e76 Force the argument into String prior to splitting 2021-05-24 13:32:03 +02:00
D. Berge
9875ae86f3 Record P1/11 line name in database on import 2021-05-24 13:30:25 +02:00
D. Berge
53f71f7005 Set primary key on events_seq_labels in schema template 2021-05-23 22:27:00 +02:00
D. Berge
5de64e6b45 Add meta column to events view in schema template 2021-05-23 22:26:00 +02:00
D. Berge
67af85eca9 Recognise PENDING status in sequence imports.
If a final sequence file or directory name matches a pattern
which is recognised to indicate a ‘pending acceptance’ status,
the final data (if any exists) for that sequence will be deleted
and a comment added to the effect that the sequence has been
marked as ‘pending’.

To accept the sequence, rename its final file or directory name
accordingly.

Note: it is the *final* data that is searched for a matching
pattern, not the raw.

Closes #91.
2021-05-21 15:15:15 +02:00
D. Berge
779b28a331 Add info table to system dumps 2021-05-21 12:18:36 +02:00
D. Berge
b9a4d18ed9 Do not fail if no equipment has been defined.
Fixes #112.
2021-05-20 21:16:39 +02:00
D. Berge
0dc9ac2b3c Merge branch '71-add-equipment-info-to-the-logs' into 'devel'
Resolve "Add equipment info to the logs"

Closes #71

See merge request wgp/dougal/software!10
2021-05-20 19:05:35 +00:00
D. Berge
39d85a692b Use default Nunjucks template if necessary.
If the survey configuration does not itself have a template
we will use the one in etc/defaults/templates/sequence.html.njk.

The template is unlikely to change all that often, and this
avoids issues when people forget to copy it across
to a new survey, etc.
2021-05-20 20:38:39 +02:00
D. Berge
e7661bfd1c Do not fail if requested object does not exist 2021-05-20 20:38:08 +02:00
D. Berge
1649de6c68 Update default sequence HTML template 2021-05-20 20:37:37 +02:00
D. Berge
1089d1fe75 Add equipment configuration frontend user interface 2021-05-20 18:35:56 +02:00
D. Berge
fc58a4d435 Implement equipment frontend component 2021-05-20 18:35:56 +02:00
D. Berge
c832d8b107 Commit default template for sequences 2021-05-20 18:35:56 +02:00
D. Berge
4a9e61be78 Add unique filter to Nunjucks renderer 2021-05-20 18:35:56 +02:00
D. Berge
8cfd1a7fc9 Export equipment info to Seis+JSON files 2021-05-20 18:35:56 +02:00
D. Berge
315733eec0 Refactor events export middleware.
Uses the `prepare` method for better reusability.
2021-05-20 18:35:56 +02:00
D. Berge
ad422abe94 Add prepare method for Seis+JSON and related exports.
It retrieves the data necessary for a complete Seis+JSON
export, including equipment info.
2021-05-20 18:35:56 +02:00
D. Berge
92210378e1 Listen for and broadcast info notifications 2021-05-20 18:21:01 +02:00
D. Berge
8d3e665206 Expose new API endpoint: /info/:path(*).
Provides CRUD access to values (which may be deeply nested) from the
global `info` table.
2021-05-20 18:19:29 +02:00
D. Berge
4ee65ef284 Implement info/delete middleware 2021-05-20 18:18:26 +02:00
D. Berge
d048a19066 Implement info/put middleware 2021-05-20 18:18:13 +02:00
D. Berge
97ed9bcce4 Implement info/post middleware 2021-05-20 18:17:52 +02:00
D. Berge
316117cb83 Implement info.delete() database method.
It deletes a (possibly deeply nested) element in the
`info` table.
2021-05-20 18:16:26 +02:00
D. Berge
1d38f6526b Implement info.put() database method.
Replaces an existing element with a new one, or inserts it
if there is nothing to replace. The element may be deeply
nested inside a JSON object or array in the `info` table.

Works for both public.info and survey_?.info.
2021-05-20 18:14:43 +02:00
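In plain Python terms, the put semantics look roughly like this. This is a sketch on nested dicts only; the real method operates on jsonb values in the `info` table and also handles arrays:

```python
def info_put(obj, path, value):
    """Insert or replace `value` at a (possibly deeply nested) `path`,
    creating intermediate objects as needed. Dicts only, for illustration;
    the real database method also handles JSON arrays."""
    node = obj
    for part in path[:-1]:
        node = node.setdefault(part, {})
    node[path[-1]] = value
    return obj
```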
D. Berge
6feb7d49ee Implement info.post() database method.
It adds an element to a JSON array corresponding to a
key in the info table. Errors out if the value is not
an array.
2021-05-20 18:13:15 +02:00
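The post semantics, again sketched on plain Python structures (the real method works on jsonb values in the `info` table):

```python
def info_post(obj, path, element):
    """Append `element` to the JSON array found at `path`.
    Errors out if the value at `path` is not an array."""
    node = obj
    for part in path:
        node = node[part]
    if not isinstance(node, list):
        raise TypeError(f"value at {'/'.join(path)} is not an array")
    node.append(element)
    return obj
```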
D. Berge
ac51f72180 Ignore empty path parts in info.get() 2021-05-20 18:10:51 +02:00
D. Berge
86d3323869 Remove logging statement 2021-05-20 18:10:27 +02:00
D. Berge
b181e4f424 Let the user set the search path to no survey.
This is so that we can access tables in the `public`
schema which are overloaded by survey tables, as is
the case with `info`.
2021-05-20 18:08:03 +02:00
D. Berge
7917eeeb0b Add table info to schema.
This one is independent of any projects so it goes
into `public`.
2021-05-20 18:07:05 +02:00
D. Berge
b18907fb05 Merge branch '53-mark-points-as-not-to-be-acquired-ntba' into 'devel'
Resolve "Mark points as ‘not to be acquired’ (NTBA)"

Closes #53

See merge request wgp/dougal/software!9
2021-05-17 18:34:46 +00:00
D. Berge
3e1861fcf6 Update API description 2021-05-17 20:30:59 +02:00
D. Berge
820b0c2b91 Add set line complete / incomplete actions.
The following options are shown:

* Set line complete:

If a line has been partially shot and still has points
to be acquired.

This option marks remaining virgin points as NTBA=true.

* Set line incomplete:

If a line has been partially shot and remaining virgin
points have been marked as NTBA.

This option marks all points in the line as NTBA=false.

* Set line NTBA:

If a line has not been (successfully) shot at all, i.e.,
all points on the line are virgin.

This option marks the line itself as NTBA=true.

* Unset line NTBA:

If a line has been marked as NTBA.

This option clears the NTBA flag from the line.
2021-05-17 20:19:53 +02:00
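The four options above can be sketched on a simple in-memory model (field and function names are assumptions; the real implementation updates preplot points and lines in SQL):

```python
def set_line_complete(line):
    """Mark remaining virgin points as NTBA (line partially shot)."""
    for p in line["points"]:
        if p["virgin"]:
            p["ntba"] = True

def set_line_incomplete(line):
    """Clear the NTBA flag from *all* points on the line."""
    for p in line["points"]:
        p["ntba"] = False

def set_line_ntba(line, flag=True):
    """Set or clear the line-level NTBA flag (for fully virgin lines)."""
    line["ntba"] = flag
```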
D. Berge
57f4834da8 Add information about virgin and remaining points 2021-05-17 20:19:16 +02:00
D. Berge
08d33e293a React also on preplot point changes, not just lines 2021-05-17 20:18:33 +02:00
D. Berge
8e71b18225 Add complete to line PATCH options.
`complete` is a boolean.

If true, any virgin points remaining on the line
will be marked as `ntba=true`.

If false, *all* points on the line will be marked
as `ntba=false`.
2021-05-17 20:15:34 +02:00
D. Berge
f297458954 Report on virgin points and points to be acquired.
Virgin points are those that have not been acquired
(and processed) at least once.

Points to be acquired are virgin points that do not
have the `ntba` flag set.
2021-05-17 20:13:53 +02:00
D. Berge
eb28648e57 Remove bogus dependency 2021-05-17 17:18:35 +02:00
D. Berge
0c352512b0 Enable the ‘view on map’ log action item. 2021-05-17 17:14:58 +02:00
D. Berge
4d87506720 Show a map marker if position given in URL hash.
If the location URL contains a hash in one of these formats:

* #z/x/y
* #x/y

In the first case it will zoom and pan to the location;
in the second case it will only pan while maintaining the
current (or last used) zoom level.

If the location URL does not contain a hash in one of those
formats, the marker will be removed from the map.
2021-05-17 17:14:35 +02:00
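The hash parsing amounts to the following (a hypothetical parser; the real code runs client-side in the map component):

```python
def parse_map_hash(hash_str):
    """Parse '#z/x/y' (zoom and pan) or '#x/y' (pan only).
    Returns (zoom, x, y), with zoom None for pan-only,
    or None if the hash is not in one of those formats."""
    parts = hash_str.lstrip("#").split("/")
    try:
        nums = [float(p) for p in parts]
    except ValueError:
        return None
    if len(nums) == 3:
        return (nums[0], nums[1], nums[2])
    if len(nums) == 2:
        return (None, nums[0], nums[1])
    return None
```

A `None` result corresponds to removing the marker from the map.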
D. Berge
20bce40dac Upgrade Vue components 2021-05-17 14:22:26 +02:00
D. Berge
cf79cf86ae Fix ‘this is undefined’ error 2021-05-16 21:38:31 +02:00
D. Berge
8e4f62e5be Reset snack message when hiding.
This is so that the same message will cause the snack
to be shown again.
2021-05-16 19:58:36 +02:00
D. Berge
a8850e5d0c Protect the /project/:project/meta route 2021-05-16 19:58:03 +02:00
D. Berge
b5a762b5e3 Merge branch '108-remove-edit-controls-for-read-only-users' into 'devel'
Resolve "Remove edit controls for read-only users"

Closes #108

See merge request wgp/dougal/software!8
2021-05-16 17:56:35 +00:00
D. Berge
418f1a00b8 Hide edit controls from read-only users 2021-05-16 19:55:31 +02:00
D. Berge
0d9f7ac4ec Add privilege level getters to Vuex.
* writeaccess: true if user can change data.
* adminaccess: true if user is an administrator.
2021-05-16 19:53:24 +02:00
D. Berge
76c9c3ef2a Assign (some) offline navdata to a survey.
There is no concept of ‘current survey’ in Dougal, and
assigning navigation data to a particular survey is full
of edge cases but sometimes it is necessary or at least
convenient to do so.

This commit implements once such strategy, which consists
of checking the distance to the preplots of all active
surveys (well, those that do have preplots anyway) and
picking the nearest one.

To reduce load, we only do this every once in a while as
governed by the `offline_survey_detect_interval` option
in the configuration.

This strategy is only active if the configuration option
`offline_survey_heuristics == "nearest_preplot"` for the
corresponding navigation header.
2021-05-16 03:16:19 +02:00
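The nearest-preplot heuristic boils down to a minimum-distance search; a rough sketch, where the names and data shapes are assumptions:

```python
import math

def nearest_survey(position, surveys):
    """Pick the active survey whose preplot points lie nearest to `position`.
    `surveys` maps survey name -> list of (x, y) preplot points; surveys
    without preplots are skipped, mirroring the behaviour described above."""
    best, best_d = None, math.inf
    for name, preplot in surveys.items():
        if not preplot:
            continue
        d = min(math.dist(position, p) for p in preplot)
        if d < best_d:
            best, best_d = name, d
    return best
```

In the real system this would run at most once per `offline_survey_detect_interval` milliseconds.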
D. Berge
ef798860cd Add collect filter to template renderer.
This filter can collect attributes from items having the
same key into a single item.

Can be used in templates like this:

{% for Entry in Sequence.Entries |
   collect("ShotPointId", ["EntryType", "Comment"]) %}

to avoid duplicating shotpoint numbers.
2021-05-15 20:07:02 +02:00
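The filter's behaviour, sketched in Python for clarity (the actual filter is registered with the Nunjucks renderer; the function name matches the filter, the rest is illustrative):

```python
def collect(items, key, attrs):
    """Group items sharing the same `key` value into a single item,
    gathering each attribute named in `attrs` into a list."""
    out, index = [], {}
    for item in items:
        k = item[key]
        if k not in index:
            entry = {key: k}
            for a in attrs:
                entry[a] = [item.get(a)]
            index[k] = entry
            out.append(entry)
        else:
            for a in attrs:
                index[k][a].append(item.get(a))
    return out
```

Each shotpoint number then appears once, with its entry types and comments collected alongside it.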
D. Berge
e57c362d94 Fix error with timestamp filter (again) 2021-05-15 20:06:36 +02:00
D. Berge
7605b11fdb Fix error with timestamp Nunjucks filter 2021-05-15 18:59:47 +02:00
D. Berge
84e791fc66 Add more sequence information to SeisJSON file 2021-05-15 18:37:32 +02:00
D. Berge
3e2126cc32 Add option to download reports from sequence list.
The context menu includes options to download the sequence
report in different formats.
2021-05-15 17:12:41 +02:00
D. Berge
b0f4559b83 Allow direct downloading of sequence reports.
If the `download` or `d` query parameter is supplied (even
without any value), the response will include a
`Content-Disposition: attachment` header. A filename will
also be suggested.
2021-05-15 17:10:28 +02:00
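The header logic amounts to the following (a sketch of the decision only; names are assumed, and the real code sets the header on the HTTP response):

```python
def download_headers(query, suggested_name):
    """Return extra response headers when the `download` or `d`
    query parameter is present, even with an empty value."""
    if "download" in query or "d" in query:
        return {"Content-Disposition": f'attachment; filename="{suggested_name}"'}
    return {}
```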
D. Berge
c7e2e18cc8 Merge branch '84-produce-human-readable-versions-of-json-structured-sequence-data-exports-sse' into 'devel'
Resolve "Produce human-readable versions of JSON structured sequence data exports (SSE)"

Closes #84

See merge request wgp/dougal/software!7
2021-05-15 13:07:07 +00:00
64 changed files with 3174 additions and 583 deletions

View File

@@ -406,12 +406,20 @@ class Datastore:
self.del_hash("*online*", cursor)
qry = """
INSERT INTO raw_lines (sequence, line, remarks, ntbp, incr)
VALUES (%s, %s, '', %s, %s)
INSERT INTO raw_lines (sequence, line, remarks, ntbp, incr, meta)
VALUES (%s, %s, '', %s, %s, %s)
ON CONFLICT DO NOTHING;
"""
cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"], ntbp, incr))
cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"], ntbp, incr, json.dumps(fileinfo["meta"])))
qry = """
UPDATE raw_lines
SET meta = meta || %s
WHERE sequence = %s;
"""
cursor.execute(qry, (json.dumps(fileinfo["meta"]), fileinfo["sequence"]))
qry = """
INSERT INTO raw_lines_files (sequence, hash)
@@ -448,12 +456,20 @@ class Datastore:
hash = self.add_file(filepath, cursor)
qry = """
INSERT INTO final_lines (sequence, line, remarks)
VALUES (%s, %s, '')
INSERT INTO final_lines (sequence, line, remarks, meta)
VALUES (%s, %s, '', %s)
ON CONFLICT DO NOTHING;
"""
cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"]))
cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"], json.dumps(fileinfo["meta"])))
qry = """
UPDATE raw_lines
SET meta = meta || %s
WHERE sequence = %s;
"""
cursor.execute(qry, (json.dumps(fileinfo["meta"]), fileinfo["sequence"]))
qry = """
INSERT INTO final_lines_files (sequence, hash)
@@ -479,6 +495,8 @@ class Datastore:
if filedata is not None:
self.save_file_data(filepath, json.dumps(filedata), cursor)
cursor.execute("CALL final_line_post_import(%s);", (fileinfo["sequence"],))
self.maybe_commit()
@@ -514,7 +532,7 @@ class Datastore:
qry = """
UPDATE raw_shots
SET meta = jsonb_set(meta, '{smsrc}', %s::jsonb, true)
SET meta = jsonb_set(meta, '{smsrc}', %s::jsonb, true) - 'qc'
WHERE sequence = %s AND point = %s;
"""
@@ -639,3 +657,21 @@ class Datastore:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def del_sequence_final(self, sequence, cursor = None):
"""
Remove final data for a sequence.
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
qry = "DELETE FROM files WHERE hash = (SELECT hash FROM final_lines_files WHERE sequence = %s);"
cur.execute(qry, (sequence,))
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction

View File

@@ -17,6 +17,35 @@ import configuration
import p111
from datastore import Datastore
def add_pending_remark(db, sequence):
text = '<!-- @@DGL:PENDING@@ --><h4 style="color:red;cursor:help;" title="Edit the sequence file or directory name to import final data">Marked as <code>PENDING</code>.</h4><!-- @@/DGL:PENDING@@ -->\n'
with db.conn.cursor() as cursor:
qry = "SELECT remarks FROM raw_lines WHERE sequence = %s;"
cursor.execute(qry, (sequence,))
remarks = cursor.fetchone()[0]
rx = re.compile("^(<!-- @@DGL:PENDING@@ -->.*<!-- @@/DGL:PENDING@@ -->\n)")
m = rx.match(remarks)
if m is None:
remarks = text + remarks
qry = "UPDATE raw_lines SET remarks = %s WHERE sequence = %s;"
cursor.execute(qry, (remarks, sequence))
db.maybe_commit()
def del_pending_remark(db, sequence):
with db.conn.cursor() as cursor:
qry = "SELECT remarks FROM raw_lines WHERE sequence = %s;"
cursor.execute(qry, (sequence,))
remarks = cursor.fetchone()[0]
rx = re.compile("^(<!-- @@DGL:PENDING@@ -->.*<!-- @@/DGL:PENDING@@ -->\n)")
m = rx.match(remarks)
if m is not None:
remarks = rx.sub("", remarks)
qry = "UPDATE raw_lines SET remarks = %s WHERE sequence = %s;"
cursor.execute(qry, (remarks, sequence))
db.maybe_commit()
if __name__ == '__main__':
print("Reading configuration")
@@ -42,6 +71,9 @@ if __name__ == '__main__':
pattern = final_p111["pattern"]
rx = re.compile(pattern["regex"])
if "pending" in survey["final"]:
pendingRx = re.compile(survey["final"]["pending"]["pattern"]["regex"])
for fileprefix in final_p111["paths"]:
print(f"Path prefix: {fileprefix}")
@@ -50,6 +82,10 @@ if __name__ == '__main__':
filepath = str(filepath)
print(f"Found {filepath}")
pending = False
if pendingRx:
pending = pendingRx.search(filepath) is not None
if not db.file_in_db(filepath):
age = time.time() - os.path.getmtime(filepath)
@@ -67,16 +103,30 @@ if __name__ == '__main__':
continue
file_info = dict(zip(pattern["captures"], match.groups()))
file_info["meta"] = {}
if pending:
print("Skipping / removing final file because marked as PENDING", filepath)
db.del_sequence_final(file_info["sequence"])
add_pending_remark(db, file_info["sequence"])
continue
else:
del_pending_remark(db, file_info["sequence"])
p111_data = p111.from_file(filepath)
print("Saving")
p111_records = p111.p111_type("S", p111_data)
file_info["meta"]["lineName"] = p111.line_name(p111_data)
db.save_final_p111(p111_records, file_info, filepath, survey["epsg"])
else:
print("Already in DB")
if pending:
print("Removing from database because marked as PENDING")
db.del_sequence_final(file_info["sequence"])
add_pending_remark(db, file_info["sequence"])
print("Done")

View File

@@ -75,12 +75,14 @@ if __name__ == '__main__':
continue
file_info = dict(zip(pattern["captures"], match.groups()))
file_info["meta"] = {}
p111_data = p111.from_file(filepath)
print("Saving")
p111_records = p111.p111_type("S", p111_data)
file_info["meta"]["lineName"] = p111.line_name(p111_data)
db.save_raw_p111(p111_records, file_info, filepath, survey["epsg"], ntbp=ntbp)
else:

View File

@@ -153,6 +153,9 @@ def parse_line (string):
return None
def line_name(records):
return set([ r['Acquisition Line Name'] for r in p111_type("S", records) ]).pop()
def p111_type(type, records):
return [ r for r in records if r["type"] == type ]

View File

@@ -24,6 +24,7 @@ locals().update(configuration.vars())
exportables = {
"public": {
"projects": [ "meta" ],
"info": None,
"real_time_inputs": None
},
"survey": {
@@ -32,7 +33,8 @@ exportables = {
"preplot_lines": [ "remarks", "ntba", "meta" ],
"preplot_points": [ "ntba", "meta" ],
"raw_lines": [ "remarks", "meta" ],
"raw_shots": [ "meta" ]
"raw_shots": [ "meta" ],
"planned_lines": None
}
}

View File

@@ -40,6 +40,10 @@ if __name__ == '__main__':
continue
try:
for table in exportables:
path = os.path.join(pathPrefix, table)
if os.path.exists(path):
cursor.execute(f"DELETE FROM {table};")
for table in exportables:
path = os.path.join(pathPrefix, table)
print("", path, "", table)

View File

@@ -19,6 +19,7 @@ locals().update(configuration.vars())
exportables = {
"public": {
"projects": [ "meta" ],
"info": None,
"real_time_inputs": None
},
"survey": {
@@ -27,7 +28,8 @@ exportables = {
"preplot_lines": [ "remarks", "ntba", "meta" ],
"preplot_points": [ "ntba", "meta" ],
"raw_lines": [ "remarks", "meta" ],
"raw_shots": [ "meta" ]
"raw_shots": [ "meta" ],
"planned_lines": None
}
}

View File

@@ -21,6 +21,10 @@ navigation:
# Anything here gets passed as options to the packet
# saving routine.
epsg: 23031 # Assume this CRS for unqualified E/N data
# Heuristics to apply to detect survey when offline
offline_survey_heuristics: "nearest_preplot"
# Apply the heuristics at most once every…
offline_survey_detect_interval: 10000 # ms
imports:

View File

@@ -226,6 +226,18 @@ CREATE TABLE public.real_time_inputs (
ALTER TABLE public.real_time_inputs OWNER TO postgres;
--
-- Name: info; Type: TABLE; Schema: public; Owner: postgres
--
CREATE TABLE public.info (
key text NOT NULL,
value jsonb
);
ALTER TABLE public.info OWNER TO postgres;
--
-- Name: projects projects_name_key; Type: CONSTRAINT; Schema: public; Owner: postgres
--
@@ -250,6 +262,16 @@ ALTER TABLE ONLY public.projects
ADD CONSTRAINT projects_schema_key UNIQUE (schema);
--
-- Name: info info_pkey; Type: CONSTRAINT; Schema: public; Owner: postgres
--
ALTER TABLE ONLY public.info
ADD CONSTRAINT info_pkey PRIMARY KEY (key);
--
-- Name: tstamp_idx; Type: INDEX; Schema: public; Owner: postgres
--
@@ -271,6 +293,13 @@ CREATE TRIGGER projects_tg AFTER INSERT OR DELETE OR UPDATE ON public.projects F
CREATE TRIGGER real_time_inputs_tg AFTER INSERT ON public.real_time_inputs FOR EACH ROW EXECUTE FUNCTION public.notify('realtime');
--
-- Name: info info_tg; Type: TRIGGER; Schema: public; Owner: postgres
--
CREATE TRIGGER info_tg AFTER INSERT OR DELETE OR UPDATE ON public.info FOR EACH ROW EXECUTE FUNCTION public.notify('info');
--
-- PostgreSQL database dump complete
--

View File

@@ -2,8 +2,8 @@
-- PostgreSQL database dump
--
-- Dumped from database version 12.4
-- Dumped by pg_dump version 12.4
-- Dumped from database version 12.6
-- Dumped by pg_dump version 12.7
SET statement_timeout = 0;
SET lock_timeout = 0;
@@ -136,6 +136,38 @@ $$;
ALTER FUNCTION _SURVEY__TEMPLATE_.clear_shot_qc() OWNER TO postgres;
--
-- Name: events_seq_labels_single(); Type: FUNCTION; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
CREATE FUNCTION _SURVEY__TEMPLATE_.events_seq_labels_single() RETURNS trigger
LANGUAGE plpgsql
AS $$
DECLARE _sequence integer;
BEGIN
IF EXISTS(SELECT 1 FROM labels WHERE name = NEW.label AND (data->'model'->'multiple')::boolean IS FALSE) THEN
SELECT sequence INTO _sequence FROM events WHERE id = NEW.id;
DELETE
FROM events_seq_labels
WHERE
id <> NEW.id
AND label = NEW.label
AND id IN (SELECT id FROM events_seq WHERE sequence = _sequence);
DELETE
FROM events_timed_labels
WHERE
id <> NEW.id
AND label = NEW.label
AND id IN (SELECT id FROM events_timed_seq WHERE sequence = _sequence);
END IF;
RETURN NULL;
END;
$$;
ALTER FUNCTION _SURVEY__TEMPLATE_.events_seq_labels_single() OWNER TO postgres;
--
-- Name: events_timed_seq_match(); Type: FUNCTION; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
@@ -213,82 +245,102 @@ $$;
ALTER PROCEDURE _SURVEY__TEMPLATE_.events_timed_seq_update_all() OWNER TO postgres;
--
-- Name: reset_events_serials(); Type: FUNCTION; Schema: _SURVEY__TEMPLATE_; Owner: postgres
-- Name: final_line_post_import(integer); Type: PROCEDURE; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
CREATE FUNCTION _SURVEY__TEMPLATE_.reset_events_serials() RETURNS void
CREATE PROCEDURE _SURVEY__TEMPLATE_.final_line_post_import(_seq integer)
LANGUAGE plpgsql
AS $$
BEGIN
PERFORM setval('events_timed_id_seq', (SELECT max(id)+1 FROM events_timed));
PERFORM setval('events_seq_id_seq', (SELECT max(id)+1 FROM events_seq));
CALL handle_final_line_events(_seq, 'FSP', 'fsp');
CALL handle_final_line_events(_seq, 'FGSP', 'fsp');
CALL handle_final_line_events(_seq, 'LGSP', 'lsp');
CALL handle_final_line_events(_seq, 'LSP', 'lsp');
END;
$$;
ALTER FUNCTION _SURVEY__TEMPLATE_.reset_events_serials() OWNER TO postgres;
ALTER PROCEDURE _SURVEY__TEMPLATE_.final_line_post_import(_seq integer) OWNER TO postgres;
--
-- Name: to_binning_grid(public.geometry); Type: FUNCTION; Schema: _SURVEY__TEMPLATE_; Owner: postgres
-- Name: handle_final_line_events(integer, text, text); Type: PROCEDURE; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
CREATE FUNCTION _SURVEY__TEMPLATE_.to_binning_grid(geom public.geometry) RETURNS public.geometry
LANGUAGE plpgsql STABLE LEAKPROOF
AS $$DECLARE
bp jsonb := binning_parameters();
theta numeric := (bp->>'theta')::numeric * pi() / 180;
I_inc numeric DEFAULT 1;
J_inc numeric DEFAULT 1;
I_width numeric := (bp->>'I_width')::numeric;
J_width numeric := (bp->>'J_width')::numeric;
CREATE PROCEDURE _SURVEY__TEMPLATE_.handle_final_line_events(_seq integer, _label text, _column text)
LANGUAGE plpgsql
AS $$
a numeric := (I_inc/I_width) * cos(theta);
b numeric := (I_inc/I_width) * -sin(theta);
c numeric := (J_inc/J_width) * sin(theta);
d numeric := (J_inc/J_width) * cos(theta);
xoff numeric := (bp->'origin'->>'I')::numeric;
yoff numeric := (bp->'origin'->>'J')::numeric;
E0 numeric := (bp->'origin'->>'easting')::numeric;
N0 numeric := (bp->'origin'->>'northing')::numeric;
DECLARE
_line final_lines_summary%ROWTYPE;
_column_value integer;
_tg_name text := 'final_line';
_event events%ROWTYPE;
event_id integer;
BEGIN
-- RAISE NOTICE 'Matrix: a: %, b: %, c: %, d: %, xoff: %, yoff: %', a, b, c, d, xoff, yoff;
RETURN ST_SetSRID(ST_Affine(ST_Translate(geom, -E0, -N0), a, b, c, d, xoff, yoff), 0);
END
SELECT * INTO _line FROM final_lines_summary WHERE sequence = _seq;
_event := label_in_sequence(_seq, _label);
_column_value := row_to_json(_line)->>_column;
--RAISE NOTICE '% is %', _label, _event;
--RAISE NOTICE 'Line is %', _line;
--RAISE NOTICE '% is % (%)', _column, _column_value, _label;
IF _event IS NULL THEN
--RAISE NOTICE 'We will populate the event log from the sequence data';
SELECT id INTO event_id FROM events_seq WHERE sequence = _seq AND point = _column_value ORDER BY id LIMIT 1;
IF event_id IS NULL THEN
--RAISE NOTICE ' but there is no existing event so we create a new one for sequence % and point %', _line.sequence, _column_value;
INSERT INTO events_seq (sequence, point, remarks)
VALUES (_line.sequence, _column_value, format('%s %s', _label, (SELECT meta->>'lineName' FROM final_lines WHERE sequence = _seq)))
RETURNING id INTO event_id;
--RAISE NOTICE 'Created event_id %', event_id;
END IF;
--RAISE NOTICE 'Remove any other auto-inserted % labels in sequence %', _label, _seq;
DELETE FROM events_seq_labels
WHERE label = _label AND id = (SELECT id FROM events_seq WHERE sequence = _seq AND meta->'auto' ? _label);
--RAISE NOTICE 'We now add a label to the event (id, label) = (%, %)', event_id, _label;
INSERT INTO events_seq_labels (id, label) VALUES (event_id, _label) ON CONFLICT ON CONSTRAINT events_seq_labels_pkey DO NOTHING;
--RAISE NOTICE 'And also clear the %: % flag from meta.auto for any existing events for sequence %', _label, _tg_name, _seq;
UPDATE events_seq
SET meta = meta #- ARRAY['auto', _label]
WHERE meta->'auto' ? _label AND sequence = _seq AND id <> event_id;
--RAISE NOTICE 'Finally, flag the event as having been had label % auto-created by %', _label, _tg_name;
UPDATE events_seq
SET meta = jsonb_set(jsonb_set(meta, '{auto}', COALESCE(meta->'auto', '{}')), ARRAY['auto', _label], to_jsonb(_tg_name))
WHERE id = event_id;
ELSE
--RAISE NOTICE 'We may populate the sequence meta from the event log';
--RAISE NOTICE 'Unless the event log was populated by us previously';
--RAISE NOTICE 'Populated by us previously? %', _event.meta->'auto'->>_label = _tg_name;
IF _event.meta->'auto'->>_label IS DISTINCT FROM _tg_name THEN
--RAISE NOTICE 'Adding % found in events log to final_line meta', _label;
UPDATE final_lines
SET meta = jsonb_set(meta, ARRAY[_label], to_jsonb(_event.point))
WHERE sequence = _seq;
--RAISE NOTICE 'Clearing the %: % flag from meta.auto for any existing events in sequence %', _label, _tg_name, _seq;
UPDATE events_seq
SET meta = meta #- ARRAY['auto', _label]
WHERE sequence = _seq AND meta->'auto'->>_label = _tg_name;
END IF;
END IF;
END;
$$;
ALTER PROCEDURE _SURVEY__TEMPLATE_.handle_final_line_events(_seq integer, _label text, _column text) OWNER TO postgres;
SET default_tablespace = '';
@@ -430,6 +482,7 @@ CREATE VIEW _SURVEY__TEMPLATE_.events_seq_timed AS
rs.objref,
rs.tstamp,
rs.hash,
s.meta,
rs.geometry
FROM (_SURVEY__TEMPLATE_.events_seq s
LEFT JOIN _SURVEY__TEMPLATE_.raw_shots rs USING (sequence, point));
@@ -524,6 +577,7 @@ CREATE VIEW _SURVEY__TEMPLATE_.events AS
s.objref,
s.tstamp,
s.hash,
s.meta,
(public.st_asgeojson(public.st_transform(s.geometry, 4326)))::jsonb AS geometry,
ARRAY( SELECT esl.label
FROM _SURVEY__TEMPLATE_.events_seq_labels esl
@@ -540,6 +594,7 @@ UNION
rs.objref,
t.tstamp,
rs.hash,
t.meta,
(t.meta -> 'geometry'::text) AS geometry,
ARRAY( SELECT etl.label
FROM _SURVEY__TEMPLATE_.events_timed_labels etl
@@ -558,6 +613,7 @@ UNION
v1.objref,
v1.tstamp,
v1.hash,
'{}'::jsonb AS meta,
(public.st_asgeojson(public.st_transform(v1.geometry, 4326)))::jsonb AS geometry,
ARRAY[v1.label] AS labels
FROM _SURVEY__TEMPLATE_.events_midnight_shot v1
@@ -572,6 +628,7 @@ UNION
rs.objref,
rs.tstamp,
rs.hash,
'{}'::jsonb AS meta,
(public.st_asgeojson(public.st_transform(rs.geometry, 4326)))::jsonb AS geometry,
('{QC}'::text[] || qc.labels) AS labels
FROM (_SURVEY__TEMPLATE_.raw_shots rs
@@ -582,6 +639,97 @@ UNION
ALTER TABLE _SURVEY__TEMPLATE_.events OWNER TO postgres;
--
-- Name: label_in_sequence(integer, text); Type: FUNCTION; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
CREATE FUNCTION _SURVEY__TEMPLATE_.label_in_sequence(_sequence integer, _label text) RETURNS _SURVEY__TEMPLATE_.events
LANGUAGE sql
AS $$
SELECT * FROM events WHERE sequence = _sequence AND _label = ANY(labels);
$$;
ALTER FUNCTION _SURVEY__TEMPLATE_.label_in_sequence(_sequence integer, _label text) OWNER TO postgres;
--
-- Name: reset_events_serials(); Type: FUNCTION; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
CREATE FUNCTION _SURVEY__TEMPLATE_.reset_events_serials() RETURNS void
LANGUAGE plpgsql
AS $$
BEGIN
PERFORM setval('events_timed_id_seq', (SELECT max(id)+1 FROM events_timed));
PERFORM setval('events_seq_id_seq', (SELECT max(id)+1 FROM events_seq));
END;
$$;
ALTER FUNCTION _SURVEY__TEMPLATE_.reset_events_serials() OWNER TO postgres;
--
-- Name: to_binning_grid(public.geometry); Type: FUNCTION; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
CREATE FUNCTION _SURVEY__TEMPLATE_.to_binning_grid(geom public.geometry) RETURNS public.geometry
LANGUAGE plpgsql STABLE LEAKPROOF
AS $$DECLARE
bp jsonb := binning_parameters();
theta numeric := (bp->>'theta')::numeric * pi() / 180;
I_inc numeric DEFAULT 1;
J_inc numeric DEFAULT 1;
I_width numeric := (bp->>'I_width')::numeric;
J_width numeric := (bp->>'J_width')::numeric;
a numeric := (I_inc/I_width) * cos(theta);
b numeric := (I_inc/I_width) * -sin(theta);
c numeric := (J_inc/J_width) * sin(theta);
d numeric := (J_inc/J_width) * cos(theta);
xoff numeric := (bp->'origin'->>'I')::numeric;
yoff numeric := (bp->'origin'->>'J')::numeric;
E0 numeric := (bp->'origin'->>'easting')::numeric;
N0 numeric := (bp->'origin'->>'northing')::numeric;
BEGIN
-- RAISE NOTICE 'Matrix: a: %, b: %, c: %, d: %, xoff: %, yoff: %', a, b, c, d, xoff, yoff;
RETURN ST_SetSRID(ST_Affine(ST_Translate(geom, -E0, -N0), a, b, c, d, xoff, yoff), 0);
END
$$;
ALTER FUNCTION _SURVEY__TEMPLATE_.to_binning_grid(geom public.geometry) OWNER TO postgres;
--
-- Name: to_binning_grid(public.geometry, jsonb); Type: FUNCTION; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
CREATE FUNCTION _SURVEY__TEMPLATE_.to_binning_grid(geom public.geometry, bp jsonb) RETURNS public.geometry
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE
AS $$DECLARE
-- bp jsonb := binning_parameters();
theta numeric := (bp->>'theta')::numeric * pi() / 180;
I_inc numeric DEFAULT 1;
J_inc numeric DEFAULT 1;
I_width numeric := (bp->>'I_width')::numeric;
J_width numeric := (bp->>'J_width')::numeric;
a numeric := (I_inc/I_width) * cos(theta);
b numeric := (I_inc/I_width) * -sin(theta);
c numeric := (J_inc/J_width) * sin(theta);
d numeric := (J_inc/J_width) * cos(theta);
xoff numeric := (bp->'origin'->>'I')::numeric;
yoff numeric := (bp->'origin'->>'J')::numeric;
E0 numeric := (bp->'origin'->>'easting')::numeric;
N0 numeric := (bp->'origin'->>'northing')::numeric;
BEGIN
-- RAISE NOTICE 'Matrix: a: %, b: %, c: %, d: %, xoff: %, yoff: %', a, b, c, d, xoff, yoff;
RETURN ST_SetSRID(ST_Affine(ST_Translate(geom, -E0, -N0), a, b, c, d, xoff, yoff), 0);
END
$$;
ALTER FUNCTION _SURVEY__TEMPLATE_.to_binning_grid(geom public.geometry, bp jsonb) OWNER TO postgres;
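The affine mapping implemented by `to_binning_grid()` can be sketched outside the database: translate the geometry to the grid origin, rotate by `theta`, and scale by the bin widths. The JavaScript below mirrors the matrix construction; the parameter values are hypothetical:

```javascript
// Sketch of to_binning_grid(): translate to the grid origin, rotate by
// theta, scale by bin widths. Parameter values are hypothetical.
function toBinningGrid (easting, northing, bp) {
  const theta = bp.theta * Math.PI / 180;
  const a = Math.cos(theta) / bp.I_width;
  const b = -Math.sin(theta) / bp.I_width;
  const c = Math.sin(theta) / bp.J_width;
  const d = Math.cos(theta) / bp.J_width;
  const x = easting - bp.origin.easting;   // ST_Translate(geom, -E0, -N0)
  const y = northing - bp.origin.northing;
  return {                                 // ST_Affine(..., a, b, c, d, xoff, yoff)
    I: a * x + b * y + bp.origin.I,
    J: c * x + d * y + bp.origin.J
  };
}

const bp = {
  theta: 0, I_width: 25, J_width: 12.5,
  origin: { I: 1000, J: 2000, easting: 500000, northing: 6000000 }
};
console.log(toBinningGrid(500025, 6000012.5, bp)); // one bin along each axis
```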
--
-- Name: events_labels; Type: VIEW; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
@@ -824,7 +972,8 @@ CREATE VIEW _SURVEY__TEMPLATE_.final_lines_summary AS
WHERE ((preplot_points.line = fl.line) AND (((preplot_points.point >= s.fsp) AND (preplot_points.point <= s.lsp)) OR ((preplot_points.point >= s.lsp) AND (preplot_points.point <= s.fsp))))) - s.num_points) AS missing_shots,
s.length,
s.azimuth,
fl.remarks
fl.remarks,
fl.meta
FROM (summary s
JOIN _SURVEY__TEMPLATE_.final_lines fl USING (sequence));
@@ -1384,7 +1533,8 @@ CREATE VIEW _SURVEY__TEMPLATE_.raw_lines_summary AS
s.length,
s.azimuth,
rl.remarks,
rl.ntbp
rl.ntbp,
rl.meta
FROM (summary s
JOIN _SURVEY__TEMPLATE_.raw_lines rl USING (sequence));
@@ -1530,6 +1680,8 @@ CREATE VIEW _SURVEY__TEMPLATE_.sequences_summary AS
COALESCE(fls.azimuth, rls.azimuth) AS azimuth,
rls.remarks,
fls.remarks AS remarks_final,
rls.meta,
fls.meta AS meta_final,
CASE
WHEN (rls.ntbp IS TRUE) THEN 'ntbp'::text
WHEN (fls.sequence IS NULL) THEN 'raw'::text
@@ -1555,6 +1707,14 @@ ALTER TABLE ONLY _SURVEY__TEMPLATE_.events_seq ALTER COLUMN id SET DEFAULT nextv
ALTER TABLE ONLY _SURVEY__TEMPLATE_.events_timed ALTER COLUMN id SET DEFAULT nextval('_SURVEY__TEMPLATE_.events_timed_id_seq'::regclass);
--
-- Name: events_seq_labels events_seq_labels_pkey; Type: CONSTRAINT; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
ALTER TABLE ONLY _SURVEY__TEMPLATE_.events_seq_labels
ADD CONSTRAINT events_seq_labels_pkey PRIMARY KEY (id, label);
--
-- Name: events_seq events_seq_pkey; Type: CONSTRAINT; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
@@ -1656,7 +1816,7 @@ ALTER TABLE ONLY _SURVEY__TEMPLATE_.planned_lines
--
ALTER TABLE ONLY _SURVEY__TEMPLATE_.planned_lines
ADD CONSTRAINT planned_lines_pkey PRIMARY KEY (sequence);
ADD CONSTRAINT planned_lines_pkey PRIMARY KEY (sequence) DEFERRABLE;
--
@@ -1713,6 +1873,20 @@ CREATE INDEX events_seq_sequence_idx ON _SURVEY__TEMPLATE_.events_seq USING btre
CREATE INDEX events_timed_ts0_idx ON _SURVEY__TEMPLATE_.events_timed USING btree (tstamp);
--
-- Name: events_seq_labels events_seq_labels_single_tg; Type: TRIGGER; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
CREATE TRIGGER events_seq_labels_single_tg AFTER INSERT OR UPDATE ON _SURVEY__TEMPLATE_.events_seq_labels FOR EACH ROW EXECUTE FUNCTION _SURVEY__TEMPLATE_.events_seq_labels_single();
--
-- Name: events_timed_labels events_timed_labels_single_tg; Type: TRIGGER; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
CREATE TRIGGER events_timed_labels_single_tg AFTER INSERT OR UPDATE ON _SURVEY__TEMPLATE_.events_timed_labels FOR EACH ROW EXECUTE FUNCTION _SURVEY__TEMPLATE_.events_seq_labels_single();
--
-- Name: events_seq events_tg; Type: TRIGGER; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
@@ -1766,7 +1940,7 @@ CREATE TRIGGER final_shots_tg AFTER INSERT OR DELETE OR UPDATE ON _SURVEY__TEMPL
-- Name: planned_lines planned_lines_tg; Type: TRIGGER; Schema: _SURVEY__TEMPLATE_; Owner: postgres
--
CREATE TRIGGER planned_lines_tg AFTER INSERT OR DELETE OR UPDATE ON _SURVEY__TEMPLATE_.planned_lines FOR EACH STATEMENT EXECUTE FUNCTION public.notify('planned_lines');
CREATE TRIGGER planned_lines_tg AFTER INSERT OR DELETE OR UPDATE ON _SURVEY__TEMPLATE_.planned_lines FOR EACH ROW EXECUTE FUNCTION public.notify('planned_lines');
--


@@ -0,0 +1,22 @@
-- Upgrade the database from commit 78adb2be to 7917eeeb.
--
-- This upgrade affects the `public` schema only.
--
-- It creates a new table, `info`, for storing arbitrary JSON
-- data not belonging to a specific project. Currently used
-- for the equipment list, it could also serve to store user
-- details, configuration settings, system state, etc.
--
-- To apply, run as the dougal user:
--
-- psql < $THIS_FILE
--
-- NOTE: It will fail harmlessly if applied twice.
CREATE TABLE IF NOT EXISTS public.info (
key text NOT NULL primary key,
value jsonb
);
CREATE TRIGGER info_tg AFTER INSERT OR DELETE OR UPDATE ON public.info FOR EACH ROW EXECUTE FUNCTION public.notify('info');


@@ -0,0 +1,160 @@
-- Upgrade the database from commit 6e7ba82e to 53f71f70.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This merges two changes to the database.
-- The first one (commit 5de64e6b) modifies the `event` view to return
-- the `meta` column of timed and sequence events.
-- The second one (commit 53f71f70) adds a primary key constraint to
-- events_seq_labels (there is already an equivalent constraint on
-- events_seq_timed).
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It will fail harmlessly if applied twice.
BEGIN;
DROP VIEW events_seq_timed CASCADE; -- Brings down events too
ALTER TABLE ONLY events_seq_labels
ADD CONSTRAINT events_seq_labels_pkey PRIMARY KEY (id, label);
CREATE OR REPLACE VIEW events_seq_timed AS
SELECT s.sequence,
s.point,
s.id,
s.remarks,
rs.line,
rs.objref,
rs.tstamp,
rs.hash,
s.meta,
rs.geometry
FROM (events_seq s
LEFT JOIN raw_shots rs USING (sequence, point));
CREATE OR REPLACE VIEW events AS
WITH qc AS (
SELECT rs.sequence,
rs.point,
ARRAY[jsonb_array_elements_text(q.labels)] AS labels
FROM raw_shots rs,
LATERAL jsonb_path_query(rs.meta, '$."qc".*."labels"'::jsonpath) q(labels)
)
SELECT 'sequence'::text AS type,
false AS virtual,
s.sequence,
s.point,
s.id,
s.remarks,
s.line,
s.objref,
s.tstamp,
s.hash,
s.meta,
(public.st_asgeojson(public.st_transform(s.geometry, 4326)))::jsonb AS geometry,
ARRAY( SELECT esl.label
FROM events_seq_labels esl
WHERE (esl.id = s.id)) AS labels
FROM events_seq_timed s
UNION
SELECT 'timed'::text AS type,
false AS virtual,
rs.sequence,
rs.point,
t.id,
t.remarks,
rs.line,
rs.objref,
t.tstamp,
rs.hash,
t.meta,
(t.meta -> 'geometry'::text) AS geometry,
ARRAY( SELECT etl.label
FROM events_timed_labels etl
WHERE (etl.id = t.id)) AS labels
FROM ((events_timed t
LEFT JOIN events_timed_seq ts USING (id))
LEFT JOIN raw_shots rs USING (sequence, point))
UNION
SELECT 'midnight shot'::text AS type,
true AS virtual,
v1.sequence,
v1.point,
((v1.sequence * 100000) + v1.point) AS id,
''::text AS remarks,
v1.line,
v1.objref,
v1.tstamp,
v1.hash,
'{}'::jsonb AS meta,
(public.st_asgeojson(public.st_transform(v1.geometry, 4326)))::jsonb AS geometry,
ARRAY[v1.label] AS labels
FROM events_midnight_shot v1
UNION
SELECT 'qc'::text AS type,
true AS virtual,
rs.sequence,
rs.point,
((10000000 + (rs.sequence * 100000)) + rs.point) AS id,
(q.remarks)::text AS remarks,
rs.line,
rs.objref,
rs.tstamp,
rs.hash,
'{}'::jsonb AS meta,
(public.st_asgeojson(public.st_transform(rs.geometry, 4326)))::jsonb AS geometry,
('{QC}'::text[] || qc.labels) AS labels
FROM (raw_shots rs
LEFT JOIN qc USING (sequence, point)),
LATERAL jsonb_path_query(rs.meta, '$."qc".*."results"'::jsonpath) q(remarks)
WHERE (rs.meta ? 'qc'::text);
CREATE OR REPLACE VIEW final_lines_summary AS
WITH summary AS (
SELECT DISTINCT fs.sequence,
first_value(fs.point) OVER w AS fsp,
last_value(fs.point) OVER w AS lsp,
first_value(fs.tstamp) OVER w AS ts0,
last_value(fs.tstamp) OVER w AS ts1,
count(fs.point) OVER w AS num_points,
public.st_distance(first_value(fs.geometry) OVER w, last_value(fs.geometry) OVER w) AS length,
((public.st_azimuth(first_value(fs.geometry) OVER w, last_value(fs.geometry) OVER w) * (180)::double precision) / pi()) AS azimuth
FROM final_shots fs
WINDOW w AS (PARTITION BY fs.sequence ORDER BY fs.tstamp ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
)
SELECT fl.sequence,
fl.line,
s.fsp,
s.lsp,
s.ts0,
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
(( SELECT count(*) AS count
FROM preplot_points
WHERE ((preplot_points.line = fl.line) AND (((preplot_points.point >= s.fsp) AND (preplot_points.point <= s.lsp)) OR ((preplot_points.point >= s.lsp) AND (preplot_points.point <= s.fsp))))) - s.num_points) AS missing_shots,
s.length,
s.azimuth,
fl.remarks,
fl.meta
FROM (summary s
JOIN final_lines fl USING (sequence));
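The `missing_shots` expression in the view above reads as: count the preplot points lying between `fsp` and `lsp` inclusive (in either shooting direction), then subtract the number of points actually recorded. A minimal JavaScript sketch of that arithmetic, with invented sample data:

```javascript
// Sketch of the missing_shots arithmetic from final_lines_summary:
// preplots between fsp and lsp inclusive, in either direction, minus
// the number of points actually shot. Sample data is invented.
function missingShots (preplotPoints, fsp, lsp, numPoints) {
  const lo = Math.min(fsp, lsp);
  const hi = Math.max(fsp, lsp);
  const expected = preplotPoints.filter(p => p >= lo && p <= hi).length;
  return expected - numPoints;
}

const preplots = [101, 102, 103, 104, 105];
console.log(missingShots(preplots, 105, 101, 4)); // 5 expected, 4 shot -> 1
```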
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,171 @@
-- Upgrade the database from commit 53f71f70 to 4d977848.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This adds:
--
-- * label_in_sequence (_sequence integer, _label text):
-- Returns events containing the specified label.
--
-- * handle_final_line_events (_seq integer, _label text, _column text):
-- - If _label does not exist in the events for sequence _seq:
-- it adds a new _label label at the shotpoint obtained from
-- final_lines_summary[_column].
-- - If _label does exist (and hasn't been auto-added by this function
-- in a previous run), it will add information about it to the final
-- line's metadata.
--
-- * final_line_post_import (_seq integer):
-- Calls handle_final_line_events() on the given sequence to check
-- for FSP, FGSP, LGSP and LSP labels.
--
-- * events_seq_labels_single ():
-- Trigger function to ensure that labels that have the attribute
-- `model.multiple` set to `false` occur at most once per
-- sequence. If a new instance is added to a sequence, the previous
-- instance is deleted.
--
-- * Trigger on events_seq_labels that calls events_seq_labels_single().
--
-- * Trigger on events_timed_labels that calls events_seq_labels_single().
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It will fail harmlessly if applied twice.
BEGIN;
CREATE OR REPLACE FUNCTION label_in_sequence (_sequence integer, _label text)
RETURNS events
LANGUAGE sql
AS $$
SELECT * FROM events WHERE sequence = _sequence AND _label = ANY(labels);
$$;
CREATE OR REPLACE PROCEDURE handle_final_line_events (_seq integer, _label text, _column text)
LANGUAGE plpgsql
AS $$
DECLARE
_line final_lines_summary%ROWTYPE;
_column_value integer;
_tg_name text := 'final_line';
_event events%ROWTYPE;
event_id integer;
BEGIN
SELECT * INTO _line FROM final_lines_summary WHERE sequence = _seq;
_event := label_in_sequence(_seq, _label);
_column_value := row_to_json(_line)->>_column;
--RAISE NOTICE '% is %', _label, _event;
--RAISE NOTICE 'Line is %', _line;
--RAISE NOTICE '% is % (%)', _column, _column_value, _label;
IF _event IS NULL THEN
--RAISE NOTICE 'We will populate the event log from the sequence data';
SELECT id INTO event_id FROM events_seq WHERE sequence = _seq AND point = _column_value ORDER BY id LIMIT 1;
IF event_id IS NULL THEN
--RAISE NOTICE '… but there is no existing event so we create a new one for sequence % and point %', _line.sequence, _column_value;
INSERT INTO events_seq (sequence, point, remarks)
VALUES (_line.sequence, _column_value, format('%s %s', _label, (SELECT meta->>'lineName' FROM final_lines WHERE sequence = _seq)))
RETURNING id INTO event_id;
--RAISE NOTICE 'Created event_id %', event_id;
END IF;
--RAISE NOTICE 'Remove any other auto-inserted % labels in sequence %', _label, _seq;
DELETE FROM events_seq_labels
WHERE label = _label AND id = (SELECT id FROM events_seq WHERE sequence = _seq AND meta->'auto' ? _label);
--RAISE NOTICE 'We now add a label to the event (id, label) = (%, %)', event_id, _label;
INSERT INTO events_seq_labels (id, label) VALUES (event_id, _label) ON CONFLICT ON CONSTRAINT events_seq_labels_pkey DO NOTHING;
--RAISE NOTICE 'And also clear the %: % flag from meta.auto for any existing events for sequence %', _label, _tg_name, _seq;
UPDATE events_seq
SET meta = meta #- ARRAY['auto', _label]
WHERE meta->'auto' ? _label AND sequence = _seq AND id <> event_id;
--RAISE NOTICE 'Finally, flag the event as having had label % auto-created by %', _label, _tg_name;
UPDATE events_seq
SET meta = jsonb_set(jsonb_set(meta, '{auto}', COALESCE(meta->'auto', '{}')), ARRAY['auto', _label], to_jsonb(_tg_name))
WHERE id = event_id;
ELSE
--RAISE NOTICE 'We may populate the sequence meta from the event log';
--RAISE NOTICE 'Unless the event log was populated by us previously';
--RAISE NOTICE 'Populated by us previously? %', _event.meta->'auto'->>_label = _tg_name;
IF _event.meta->'auto'->>_label IS DISTINCT FROM _tg_name THEN
--RAISE NOTICE 'Adding % found in events log to final_line meta', _label;
UPDATE final_lines
SET meta = jsonb_set(meta, ARRAY[_label], to_jsonb(_event.point))
WHERE sequence = _seq;
--RAISE NOTICE 'Clearing the %: % flag from meta.auto for any existing events in sequence %', _label, _tg_name, _seq;
UPDATE events_seq
SET meta = meta #- ARRAY['auto', _label]
WHERE sequence = _seq AND meta->'auto'->>_label = _tg_name;
END IF;
END IF;
END;
$$;
CREATE OR REPLACE PROCEDURE final_line_post_import (_seq integer)
LANGUAGE plpgsql
AS $$
BEGIN
CALL handle_final_line_events(_seq, 'FSP', 'fsp');
CALL handle_final_line_events(_seq, 'FGSP', 'fsp');
CALL handle_final_line_events(_seq, 'LGSP', 'lsp');
CALL handle_final_line_events(_seq, 'LSP', 'lsp');
END;
$$;
CREATE OR REPLACE FUNCTION events_seq_labels_single ()
RETURNS trigger
LANGUAGE plpgsql
AS $$
DECLARE _sequence integer;
BEGIN
IF EXISTS(SELECT 1 FROM labels WHERE name = NEW.label AND (data->'model'->'multiple')::boolean IS FALSE) THEN
SELECT sequence INTO _sequence FROM events WHERE id = NEW.id;
DELETE
FROM events_seq_labels
WHERE
id <> NEW.id
AND label = NEW.label
AND id IN (SELECT id FROM events_seq WHERE sequence = _sequence);
DELETE
FROM events_timed_labels
WHERE
id <> NEW.id
AND label = NEW.label
AND id IN (SELECT id FROM events_timed_seq WHERE sequence = _sequence);
END IF;
RETURN NULL;
END;
$$;
CREATE TRIGGER events_seq_labels_single_tg AFTER INSERT OR UPDATE ON events_seq_labels FOR EACH ROW EXECUTE FUNCTION events_seq_labels_single();
CREATE TRIGGER events_timed_labels_single_tg AFTER INSERT OR UPDATE ON events_timed_labels FOR EACH ROW EXECUTE FUNCTION events_seq_labels_single();
--
--NOTE Run `COMMIT;` now if all went well
--
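The effect of `events_seq_labels_single()` can be pictured as: when a label whose model marks it as non-multiple is attached to an event, every other event in the same sequence loses that label, while events in other sequences keep theirs. A hedged JavaScript sketch of that rule (the data shapes are simplified stand-ins for the real tables):

```javascript
// Sketch of the single-instance label rule: attaching a non-multiple
// label to an event removes it from every other event in the sequence.
// `rows` stands in for events_seq_labels, `events` for events_seq.
function attachLabel (rows, events, newRow, singleLabels) {
  if (singleLabels.has(newRow.label)) {
    const seq = events[newRow.id].sequence;
    rows = rows.filter(r =>
      r.id === newRow.id ||                  // keep the row being added to
      r.label !== newRow.label ||            // keep unrelated labels
      events[r.id].sequence !== seq);        // keep other sequences intact
  }
  return rows.concat([newRow]);
}

const events = { 1: { sequence: 7 }, 2: { sequence: 7 }, 3: { sequence: 8 } };
let rows = [{ id: 1, label: 'FSP' }, { id: 3, label: 'FSP' }];
rows = attachLabel(rows, events, { id: 2, label: 'FSP' }, new Set(['FSP']));
console.log(rows); // FSP moves from event 1 to event 2; sequence 8 is untouched
```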


@@ -0,0 +1,94 @@
-- Upgrade the database from commit 4d977848 to 3d70a460.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This adds the `meta` column to the output of the following views:
--
-- * raw_lines_summary; and
-- * sequences_summary
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
BEGIN;
CREATE OR REPLACE VIEW raw_lines_summary AS
WITH summary AS (
SELECT DISTINCT rs.sequence,
first_value(rs.point) OVER w AS fsp,
last_value(rs.point) OVER w AS lsp,
first_value(rs.tstamp) OVER w AS ts0,
last_value(rs.tstamp) OVER w AS ts1,
count(rs.point) OVER w AS num_points,
count(pp.point) OVER w AS num_preplots,
public.st_distance(first_value(rs.geometry) OVER w, last_value(rs.geometry) OVER w) AS length,
((public.st_azimuth(first_value(rs.geometry) OVER w, last_value(rs.geometry) OVER w) * (180)::double precision) / pi()) AS azimuth
FROM (raw_shots rs
LEFT JOIN preplot_points pp USING (line, point))
WINDOW w AS (PARTITION BY rs.sequence ORDER BY rs.tstamp ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
)
SELECT rl.sequence,
rl.line,
s.fsp,
s.lsp,
s.ts0,
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
s.num_preplots,
(( SELECT count(*) AS count
FROM preplot_points
WHERE ((preplot_points.line = rl.line) AND (((preplot_points.point >= s.fsp) AND (preplot_points.point <= s.lsp)) OR ((preplot_points.point >= s.lsp) AND (preplot_points.point <= s.fsp))))) - s.num_preplots) AS missing_shots,
s.length,
s.azimuth,
rl.remarks,
rl.ntbp,
rl.meta
FROM (summary s
JOIN raw_lines rl USING (sequence));
DROP VIEW sequences_summary;
CREATE OR REPLACE VIEW sequences_summary AS
SELECT rls.sequence,
rls.line,
rls.fsp,
rls.lsp,
fls.fsp AS fsp_final,
fls.lsp AS lsp_final,
rls.ts0,
rls.ts1,
fls.ts0 AS ts0_final,
fls.ts1 AS ts1_final,
rls.duration,
fls.duration AS duration_final,
rls.num_preplots,
COALESCE(fls.num_points, rls.num_points) AS num_points,
COALESCE(fls.missing_shots, rls.missing_shots) AS missing_shots,
COALESCE(fls.length, rls.length) AS length,
COALESCE(fls.azimuth, rls.azimuth) AS azimuth,
rls.remarks,
fls.remarks AS remarks_final,
rls.meta,
fls.meta AS meta_final,
CASE
WHEN (rls.ntbp IS TRUE) THEN 'ntbp'::text
WHEN (fls.sequence IS NULL) THEN 'raw'::text
ELSE 'final'::text
END AS status
FROM (raw_lines_summary rls
LEFT JOIN final_lines_summary fls USING (sequence));
--
--NOTE Run `COMMIT;` now if all went well
--
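The `status` column above resolves each sequence to one of three states, with `ntbp` taking precedence and the presence of final-line data deciding between `raw` and `final`. The same precedence can be sketched in JavaScript (field names follow the view):

```javascript
// Sketch of the status precedence in sequences_summary:
// ntbp wins; otherwise the presence of a final line decides raw vs final.
function sequenceStatus (rls, fls) {
  if (rls.ntbp === true) return 'ntbp';
  if (!fls || fls.sequence == null) return 'raw';
  return 'final';
}

console.log(sequenceStatus({ ntbp: true }, { sequence: 12 }));  // 'ntbp'
console.log(sequenceStatus({ ntbp: false }, null));             // 'raw'
console.log(sequenceStatus({ ntbp: false }, { sequence: 12 })); // 'final'
```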


@@ -0,0 +1,33 @@
-- Upgrade the database from commit 3d70a460 to 0983abac.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This:
--
-- * makes the primary key on planned_lines deferrable; and
-- * changes the planned_lines trigger from statement to row.
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
BEGIN;
ALTER TABLE planned_lines DROP CONSTRAINT planned_lines_pkey;
ALTER TABLE planned_lines ADD CONSTRAINT planned_lines_pkey PRIMARY KEY (sequence) DEFERRABLE;
DROP TRIGGER planned_lines_tg ON planned_lines;
CREATE TRIGGER planned_lines_tg AFTER INSERT OR DELETE OR UPDATE ON planned_lines FOR EACH ROW EXECUTE FUNCTION public.notify('planned_lines');
--
--NOTE Run `COMMIT;` now if all went well
--

File diff suppressed because one or more lines are too long


@@ -17,12 +17,13 @@
"leaflet-realtime": "^2.2.0",
"leaflet.markercluster": "^1.4.1",
"marked": "^2.0.3",
"suncalc": "^1.8.0",
"typeface-roboto": "0.0.75",
"vue": "^2.6.12",
"vue-debounce": "^2.5.7",
"vue-router": "^3.4.5",
"vuetify": "^2.4.11",
"vuex": "^3.5.1"
"vue-debounce": "^2.6.0",
"vue-router": "^3.5.1",
"vuetify": "^2.5.0",
"vuex": "^3.6.2"
},
"devDependencies": {
"@vue/cli-plugin-babel": "~4.4.0",
@@ -10971,6 +10972,11 @@
"node": ">= 8"
}
},
"node_modules/suncalc": {
"version": "1.8.0",
"resolved": "https://registry.npmjs.org/suncalc/-/suncalc-1.8.0.tgz",
"integrity": "sha1-HZiYEJVjB4dQ9JlKlZ5lTYdqy/U="
},
"node_modules/supports-color": {
"version": "5.5.0",
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-5.5.0.tgz",
@@ -11735,9 +11741,12 @@
}
},
"node_modules/vue-debounce": {
"version": "2.5.7",
"resolved": "https://registry.npmjs.org/vue-debounce/-/vue-debounce-2.5.7.tgz",
"integrity": "sha512-weyMz0ee6xHLCJ+HrvfkVUQqBsH7Jx359yLTmfnmpb8fDUD3HZEwd2ZHoq+sZjcYR7JcW7B9FlqjJA2IJXqscg=="
"version": "2.6.0",
"resolved": "https://registry.npmjs.org/vue-debounce/-/vue-debounce-2.6.0.tgz",
"integrity": "sha512-afSu/LSIyZv7HjLqqmFwgp4k2OhAGIEa8XVH1MYw/qyf6ly7fJbyfUVOagbFRXP4yl61J0ujMVB31DRY0US6RA==",
"peerDependencies": {
"vue": ">= 2.0.0"
}
},
"node_modules/vue-hot-reload-api": {
"version": "2.3.4",
@@ -11869,9 +11878,9 @@
"dev": true
},
"node_modules/vue-router": {
"version": "3.4.5",
"resolved": "https://registry.npmjs.org/vue-router/-/vue-router-3.4.5.tgz",
"integrity": "sha512-ioRY5QyDpXM9TDjOX6hX79gtaMXSVDDzSlbIlyAmbHNteIL81WIVB2e+jbzV23vzxtoV0krdS2XHm+GxFg+Nxg=="
"version": "3.5.1",
"resolved": "https://registry.npmjs.org/vue-router/-/vue-router-3.5.1.tgz",
"integrity": "sha512-RRQNLT8Mzr8z7eL4p7BtKvRaTSGdCbTy2+Mm5HTJvLGYSSeG9gDzNasJPP/yOYKLy+/cLG/ftrqq5fvkFwBJEw=="
},
"node_modules/vue-style-loader": {
"version": "4.1.2",
@@ -11906,9 +11915,9 @@
"dev": true
},
"node_modules/vuetify": {
"version": "2.4.11",
"resolved": "https://registry.npmjs.org/vuetify/-/vuetify-2.4.11.tgz",
"integrity": "sha512-xFNwr95tFRfbyGNg5DBuUkWaKazMBr+ptzoSSL4PGrI0qItY5Vuusxh+ETPtjUXxwz76v5zVtGvF5rWvGQjy7A==",
"version": "2.5.0",
"resolved": "https://registry.npmjs.org/vuetify/-/vuetify-2.5.0.tgz",
"integrity": "sha512-Lpnwm64xYVEXb5BXdadSRaH0QHjXLFhPPjuVU9VuqWp3Nzr+WP5vA9nMPkJAfUj8vKIJGTRXqyGTGVa4VwrO3A==",
"funding": {
"type": "github",
"url": "https://github.com/sponsors/johnleider"
@@ -11928,9 +11937,12 @@
}
},
"node_modules/vuex": {
"version": "3.5.1",
"resolved": "https://registry.npmjs.org/vuex/-/vuex-3.5.1.tgz",
"integrity": "sha512-w7oJzmHQs0FM9LXodfskhw9wgKBiaB+totOdb8sNzbTB2KDCEEwEs29NzBZFh/lmEK1t5tDmM1vtsO7ubG1DFw=="
"version": "3.6.2",
"resolved": "https://registry.npmjs.org/vuex/-/vuex-3.6.2.tgz",
"integrity": "sha512-ETW44IqCgBpVomy520DT5jf8n0zoCac+sxWnn+hMe/CzaSejb/eVw2YToiXYX+Ex/AuHHia28vWTq4goAexFbw==",
"peerDependencies": {
"vue": "^2.0.0"
}
},
"node_modules/watchpack": {
"version": "1.7.2",
@@ -22335,6 +22347,11 @@
"when": "~3.6.x"
}
},
"suncalc": {
"version": "1.8.0",
"resolved": "https://registry.npmjs.org/suncalc/-/suncalc-1.8.0.tgz",
"integrity": "sha1-HZiYEJVjB4dQ9JlKlZ5lTYdqy/U="
},
"supports-color": {
"version": "5.5.0",
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-5.5.0.tgz",
@@ -22975,9 +22992,10 @@
}
},
"vue-debounce": {
"version": "2.5.7",
"resolved": "https://registry.npmjs.org/vue-debounce/-/vue-debounce-2.5.7.tgz",
"integrity": "sha512-weyMz0ee6xHLCJ+HrvfkVUQqBsH7Jx359yLTmfnmpb8fDUD3HZEwd2ZHoq+sZjcYR7JcW7B9FlqjJA2IJXqscg=="
"version": "2.6.0",
"resolved": "https://registry.npmjs.org/vue-debounce/-/vue-debounce-2.6.0.tgz",
"integrity": "sha512-afSu/LSIyZv7HjLqqmFwgp4k2OhAGIEa8XVH1MYw/qyf6ly7fJbyfUVOagbFRXP4yl61J0ujMVB31DRY0US6RA==",
"requires": {}
},
"vue-hot-reload-api": {
"version": "2.3.4",
@@ -23088,9 +23106,9 @@
}
},
"vue-router": {
"version": "3.4.5",
"resolved": "https://registry.npmjs.org/vue-router/-/vue-router-3.4.5.tgz",
"integrity": "sha512-ioRY5QyDpXM9TDjOX6hX79gtaMXSVDDzSlbIlyAmbHNteIL81WIVB2e+jbzV23vzxtoV0krdS2XHm+GxFg+Nxg=="
"version": "3.5.1",
"resolved": "https://registry.npmjs.org/vue-router/-/vue-router-3.5.1.tgz",
"integrity": "sha512-RRQNLT8Mzr8z7eL4p7BtKvRaTSGdCbTy2+Mm5HTJvLGYSSeG9gDzNasJPP/yOYKLy+/cLG/ftrqq5fvkFwBJEw=="
},
"vue-style-loader": {
"version": "4.1.2",
@@ -23127,9 +23145,9 @@
"dev": true
},
"vuetify": {
"version": "2.4.11",
"resolved": "https://registry.npmjs.org/vuetify/-/vuetify-2.4.11.tgz",
"integrity": "sha512-xFNwr95tFRfbyGNg5DBuUkWaKazMBr+ptzoSSL4PGrI0qItY5Vuusxh+ETPtjUXxwz76v5zVtGvF5rWvGQjy7A==",
"version": "2.5.0",
"resolved": "https://registry.npmjs.org/vuetify/-/vuetify-2.5.0.tgz",
"integrity": "sha512-Lpnwm64xYVEXb5BXdadSRaH0QHjXLFhPPjuVU9VuqWp3Nzr+WP5vA9nMPkJAfUj8vKIJGTRXqyGTGVa4VwrO3A==",
"requires": {}
},
"vuetify-loader": {
@@ -23143,9 +23161,10 @@
}
},
"vuex": {
"version": "3.5.1",
"resolved": "https://registry.npmjs.org/vuex/-/vuex-3.5.1.tgz",
"integrity": "sha512-w7oJzmHQs0FM9LXodfskhw9wgKBiaB+totOdb8sNzbTB2KDCEEwEs29NzBZFh/lmEK1t5tDmM1vtsO7ubG1DFw=="
"version": "3.6.2",
"resolved": "https://registry.npmjs.org/vuex/-/vuex-3.6.2.tgz",
"integrity": "sha512-ETW44IqCgBpVomy520DT5jf8n0zoCac+sxWnn+hMe/CzaSejb/eVw2YToiXYX+Ex/AuHHia28vWTq4goAexFbw==",
"requires": {}
},
"watchpack": {
"version": "1.7.2",


@@ -15,12 +15,13 @@
"leaflet-realtime": "^2.2.0",
"leaflet.markercluster": "^1.4.1",
"marked": "^2.0.3",
"suncalc": "^1.8.0",
"typeface-roboto": "0.0.75",
"vue": "^2.6.12",
"vue-debounce": "^2.5.7",
"vue-router": "^3.4.5",
"vuetify": "^2.4.11",
"vuex": "^3.5.1"
"vue-debounce": "^2.6.0",
"vue-router": "^3.5.1",
"vuetify": "^2.5.0",
"vuex": "^3.6.2"
},
"devDependencies": {
"@vue/cli-plugin-babel": "~4.4.0",


@@ -65,6 +65,16 @@ export default {
snackText (newVal) {
this.snack = !!newVal;
},
snack (newVal) {
// When the snack is hidden (one way or another), clear
// the text so that if we receive the same message again
// afterwards it will be shown. This way, if we get spammed
// we're also not triggering the snack too often.
if (!newVal) {
this.$store.commit('setSnackText', "");
}
}
},


@@ -12,6 +12,34 @@
<v-toolbar-title class="mx-2" @click="$router.push('/')" style="cursor: pointer;">Dougal</v-toolbar-title>
<v-spacer></v-spacer>
<v-menu bottom offset-y>
<template v-slot:activator="{on, attrs}">
<v-hover v-slot="{hover}">
<v-btn
class="align-self-center"
:xcolor="hover ? 'secondary' : 'secondary lighten-3'"
small
text
v-bind="attrs"
v-on="on"
title="Settings"
>
<v-icon small>mdi-cog-outline</v-icon>
</v-btn>
</v-hover>
</template>
<v-list dense>
<v-list-item :href="`/settings/equipment`">
<v-list-item-title>Equipment list</v-list-item-title>
<v-list-item-action><v-icon small>mdi-view-list</v-icon></v-list-item-action>
</v-list-item>
</v-list>
</v-menu>
<v-breadcrumbs :items="path"></v-breadcrumbs>
<template v-if="$route.name != 'Login'">


@@ -41,6 +41,11 @@ Vue.use(VueRouter)
// which is lazy-loaded when the route is visited.
component: () => import(/* webpackChunkName: "about" */ '../views/Feed.vue')
},
{
path: "/settings/equipment",
name: "equipment",
component: () => import(/* webpackChunkName: "about" */ '../views/Equipment.vue')
},
{
pathToRegexpOptions: { strict: true },
path: "/login",
@@ -147,6 +152,7 @@ Vue.use(VueRouter)
},
{
path: "map",
name: "map",
component: Map
}
]


@@ -13,7 +13,8 @@ async function api ({state, commit, dispatch}, [resource, init = {}, cb]) {
init.body = JSON.stringify(init.body);
}
}
const res = await fetch(`${state.apiUrl}${resource}`, init);
const url = /^https?:\/\//i.test(resource) ? resource : (state.apiUrl + resource);
const res = await fetch(url, init);
if (typeof cb === 'function') {
cb(null, res);
}
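The change above lets `api()` accept absolute URLs while still prefixing relative resources with the configured base. In isolation, the resolution rule looks like this (the helper name is illustrative):

```javascript
// Sketch of the URL resolution introduced in api(): absolute http(s)
// URLs pass through untouched, anything else gets the API base prefix.
const resolveUrl = (apiUrl, resource) =>
  /^https?:\/\//i.test(resource) ? resource : apiUrl + resource;

console.log(resolveUrl('/api', '/projects'));             // '/api/projects'
console.log(resolveUrl('/api', 'https://example.com/x')); // unchanged
```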


@@ -3,4 +3,12 @@ function user (state) {
return state.user;
}
export default { user };
function writeaccess (state) {
return state.user && ["user", "admin"].includes(state.user.role);
}
function adminaccess (state) {
return state.user && state.user.role == "admin";
}
export default { user, writeaccess, adminaccess };


@@ -0,0 +1,513 @@
<template>
<v-container fluid>
<v-row>
<v-col>
<v-dialog
max-width="600px"
:value="dialog"
@input="closeDialog"
>
<template v-slot:activator="{ on, attrs }">
<v-btn v-if="writeaccess"
small
color="primary"
v-bind="attrs"
v-on="on"
>Add</v-btn>
</template>
<v-card>
<v-card-title v-if="dialogMode=='new'">Add new item</v-card-title>
<v-card-title v-else>Edit item</v-card-title>
<v-card-text>
<v-container>
<v-row>
<v-col cols="12">
<v-text-field
label="Kind"
required
v-model="item.kind"
:disabled="dialogMode == 'edit'"
>
</v-text-field>
</v-col>
<v-col cols="12">
<v-textarea
class="markdown"
label="Description"
dense
auto-grow
rows="1"
v-model="item.description"
>
</v-textarea>
</v-col>
<v-col cols="6">
<v-text-field
label="Date"
type="date"
step="1"
v-model="item.date"
>
</v-text-field>
</v-col>
<v-col cols="6">
<v-text-field
label="Time"
type="time"
step="60"
v-model="item.time"
>
</v-text-field>
</v-col>
<template v-for="(attr, idx) in item.attributes">
<v-col cols="4">
<v-text-field
label="Attribute"
v-model="attr.key"
>
</v-text-field>
</v-col>
<v-col cols="8">
<v-textarea
label="Value"
class="markdown"
auto-grow
rows="1"
v-model="attr.value"
>
<template v-slot:append-outer>
<v-btn
fab
x-small
dark
color="red"
title="Remove this attribute / value pair"
@click="removeAttribute(idx)"
>
<v-icon>mdi-minus</v-icon>
</v-btn>
</template>
</v-textarea>
</v-col>
</template>
<v-col cols="12" class="text-right">
<v-btn
fab
x-small
color="primary"
title="Add a new attribute / value pair to further describe the equipment"
@click="addAttribute"
>
<v-icon>mdi-plus</v-icon>
</v-btn>
</v-col>
</v-row>
</v-container>
</v-card-text>
<v-card-actions>
<v-btn
color="warning"
@click="closeDialog"
>
Cancel
</v-btn>
<v-spacer></v-spacer>
<v-btn
color="success"
:loading="loading"
:disabled="!canSave || loading"
@click="saveItem"
>
Save
</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
</v-col>
</v-row>
<v-row>
<v-col cols="4">
<v-toolbar
dense
flat
>
<v-toolbar-title>
Equipment
</v-toolbar-title>
</v-toolbar>
<v-list dense two-line>
<v-subheader v-if="!latest.length">
There are no items of equipment
</v-subheader>
<v-list-item-group
v-model="selectedIndex"
color="primary"
>
<v-list-item v-for="(item, idx) in latest" :key="idx">
<v-list-item-content>
<v-list-item-title>
{{item.kind}}
</v-list-item-title>
<v-list-item-subtitle>
Last updated: {{item.tstamp.substring(0,16)}}Z
</v-list-item-subtitle>
</v-list-item-content>
</v-list-item>
</v-list-item-group>
</v-list>
</v-col>
<v-col cols="8">
<v-card v-if="selectedItem">
<v-card-title>{{selectedItem.kind}}</v-card-title>
<v-card-subtitle class="text-caption">{{selectedItem.tstamp}}</v-card-subtitle>
<v-card-text>
<v-container>
<v-row>
<div v-html="$options.filters.markdown(selectedItem.description||'')"></div>
</v-row>
<v-row>
<v-simple-table>
<template v-slot:default>
<tbody>
<tr v-for="(attr, idx) in selectedItem.attributes" :key="idx">
<td>{{attr.key}}</td>
<td v-html="$options.filters.markdown(attr.value||'')"></td>
</tr>
</tbody>
</template>
</v-simple-table>
</v-row>
</v-container>
</v-card-text>
<v-card-actions>
<v-btn v-if="writeaccess"
small
text
color="primary"
title="Make a change to this item"
@click="editItem(selectedItem)"
>
Update
</v-btn>
<v-btn-toggle
group
v-model="historyMode"
>
<v-btn
small
text
:disabled="false"
title="View item's full history of changes"
>
History
</v-btn>
</v-btn-toggle>
<v-spacer></v-spacer>
<v-btn v-if="writeaccess"
small
dark
color="red"
title="Remove this instance from the item's history"
@click="confirmDelete(selectedItem)"
>
Delete
</v-btn>
</v-card-actions>
</v-card>
<v-subheader v-else-if="latest.length" class="justify-center">Select an item from the list</v-subheader>
<v-expand-transition v-if="selectedItem">
<div v-if="historyMode===0">
<v-subheader v-if="!selectedItemHistory || !selectedItemHistory.length"
class="justify-center"
>No more history</v-subheader>
<v-card v-for="item in selectedItemHistory" :key="item.tstamp" class="mt-5">
<v-card-title>{{selectedItem.kind}}</v-card-title>
<v-card-subtitle class="text-caption">{{item.tstamp}}</v-card-subtitle>
<v-card-text>
<v-container>
<v-row>
<div v-html="$options.filters.markdown(item.description||'')"></div>
</v-row>
<v-row>
<v-simple-table>
<template v-slot:default>
<tbody>
<tr v-for="(attr, idx) in item.attributes" :key="idx">
<td>{{attr.key}}</td>
<td v-html="$options.filters.markdown(attr.value||'')"></td>
</tr>
</tbody>
</template>
</v-simple-table>
</v-row>
</v-container>
</v-card-text>
<v-card-actions>
<v-spacer></v-spacer>
<v-btn v-if="writeaccess"
small
dark
color="red"
title="Remove this instance from the item's history"
@click="confirmDelete(item)"
>
Delete
</v-btn>
</v-card-actions>
</v-card>
</div>
</v-expand-transition>
</v-col>
</v-row>
<v-dialog
:value="confirm.message"
max-width="500px"
persistent
>
<v-sheet
class="px-7 pt-7 pb-4 mx-auto text-center d-inline-block"
color="blue-grey darken-3"
dark
>
<div class="grey--text text--lighten-1 text-body-2 mb-4" v-html="confirm.message"></div>
<v-btn
:disabled="loading"
class="ma-1"
color="grey"
plain
@click="cancelConfirmAction"
>
{{ confirm.no || "Cancel" }}
</v-btn>
<v-btn
:loading="loading"
class="ma-1"
color="error"
plain
@click="doConfirmAction"
>
{{ confirm.yes || "Delete" }}
</v-btn>
</v-sheet>
</v-dialog>
</v-container>
</template>
<script>
import { mapActions, mapGetters } from 'vuex';
export default {
name: "Equipment",
data () {
return {
latest: [],
all: [],
item: {
kind: null,
description: null,
tstamp: null,
date: null,
time: null,
attributes: []
},
dialogMode: null,
selectedIndex: null,
historyMode: false,
confirm: {
message: null,
action: null,
yes: null,
no: null
}
}
},
watch: {
dialog (newVal, oldVal) {
if (newVal) {
const tstamp = new Date();
this.item.date = tstamp.toISOString().substr(0, 10);
this.item.time = tstamp.toISOString().substr(11, 5);
}
},
"item.date": function (newVal) {
if (newVal) {
this.item.tstamp = new Date(this.item.date+"T"+this.item.time);
}
},
"item.time": function (newVal) {
if (newVal) {
this.item.tstamp = new Date(this.item.date+"T"+this.item.time);
}
},
async serverEvent (event) {
if (event.payload.schema == "public") {
if (event.channel == "info") {
if (!this.loading) {
this.getEquipment();
}
}
}
}
},
computed: {
dialog () {
return !!this.dialogMode;
},
canSave () {
return this.item.kind &&
this.item.date && this.item.time &&
(this.item.attributes.length
? this.item.attributes.every(i => i.key && i.value)
: (this.item.description ||"").trim());
},
selectedItem () {
return this.selectedIndex !== null
? this.latest[this.selectedIndex]
: null;
},
selectedItemHistory () {
if (this.selectedItem && this.historyMode === 0) {
const items = this.all
.filter(i => i.kind == this.selectedItem.kind && i.tstamp != this.selectedItem.tstamp)
.sort( (a, b) => new Date(b.tstamp) - new Date(a.tstamp) );
return items;
}
return null;
},
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
},
methods: {
async cancelConfirmAction () {
this.confirm.action = null;
this.confirm.message = null;
this.confirm.yes = null;
this.confirm.no = null;
},
async doConfirmAction () {
await this.confirm.action();
this.cancelConfirmAction();
},
async getEquipment () {
const url = `/info/equipment`;
const items = await this.api([url]) || [];
this.all = [...items];
this.latest = this.all.filter(i =>
!this.all.find(j => i.kind == j.kind && i.tstamp < j.tstamp)
)
.sort( (a, b) => a.kind < b.kind ? -1 : a.kind > b.kind ? 1 : 0 );
},
addAttribute () {
this.item.attributes.push({key: undefined, value: undefined});
},
removeAttribute (idx) {
this.item.attributes.splice(idx, 1);
},
async deleteItem (item) {
const idx = this.all.findIndex(i => i.kind == item.kind && i.tstamp == item.tstamp);
if (idx == -1) {
return;
}
const url = `/info/equipment/${idx}`;
const init = {
method: "DELETE"
};
await this.api([url, init]);
await this.getEquipment();
},
confirmDelete (item) {
this.confirm.action = () => this.deleteItem(item);
this.confirm.message = "Are you sure? <b>This action is irreversible.</b>";
},
clearItem () {
this.item.kind = null;
this.item.description = null;
this.item.tstamp = null;
this.item.date = null;
this.item.time = null;
this.item.attributes = [];
},
editItem (item) {
this.item.kind = item.kind;
this.item.description = item.description;
this.item.tstamp = new Date();
this.item.attributes = [...item.attributes];
this.dialogMode = "edit"; // `dialog` is computed from dialogMode; no direct assignment needed
},
async saveItem () {
const item = {};
item.kind = this.item.kind;
item.description = this.item.description;
item.tstamp = this.item.tstamp.toISOString();
item.attributes = [...this.item.attributes.filter(i => i.key && i.value)];
if (this.dialogMode == "edit") {
this.latest.splice(this.selectedIndex, 1, item);
} else {
this.latest.push(item);
}
const url = `/info/equipment`;
const init = {
method: "POST",
body: item
};
await this.api([url, init]);
this.closeDialog();
await this.getEquipment();
},
closeDialog (state = false) {
this.clearItem();
this.dialogMode = state===true ? "new" : null;
},
...mapActions(["api"])
},
async mounted () {
await this.getEquipment();
}
}
</script>
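The `getEquipment()` method above derives `latest` by keeping only the newest record per `kind`. That filter can be extracted as a pure function (a sketch, assuming ISO-8601 `tstamp` strings so lexical comparison matches chronological order):

```javascript
// Keep only the most recent record (by tstamp) for each equipment
// kind, sorted by kind name — mirrors the `latest` computation in
// getEquipment().
function latestPerKind (all) {
  return all
    // Drop any item for which a newer item of the same kind exists.
    .filter(i => !all.find(j => i.kind == j.kind && i.tstamp < j.tstamp))
    .sort((a, b) => a.kind < b.kind ? -1 : a.kind > b.kind ? 1 : 0);
}
```

The scan is O(n²), which is fine for a short equipment list but worth noting if the history grows large.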

View File

@@ -16,7 +16,7 @@
</v-card-title>
<v-card-text>
<v-menu
<v-menu v-if="writeaccess"
v-model="contextMenuShow"
:position-x="contextMenuX"
:position-y="contextMenuY"
@@ -25,9 +25,21 @@
>
<v-list dense v-if="contextMenuItem">
<template v-if="!selectOn">
<v-list-item @click="setNTBA">
<v-list-item-title v-if="contextMenuItem.ntba">Unset NTBA</v-list-item-title>
<v-list-item-title v-else>Set NTBA</v-list-item-title>
<v-list-item @click="setNTBA" v-if="contextMenuItem.ntba || (contextMenuItem.num_points == contextMenuItem.na)">
<v-list-item-title v-if="contextMenuItem.ntba"
title="Mark the line as part of the acquisition plan"
>Unset NTBA</v-list-item-title>
<v-list-item-title v-else
title="Mark the line as not to be acquired"
>Set NTBA</v-list-item-title>
</v-list-item>
<v-list-item @click="setComplete" v-if="contextMenuItem.na && (contextMenuItem.num_points != contextMenuItem.na || contextMenuItem.tba != contextMenuItem.na)">
<v-list-item-title v-if="contextMenuItem.tba != contextMenuItem.na"
title="Mark any remaining points as pending acquisition"
>Unset line complete</v-list-item-title>
<v-list-item-title v-else
title="Mark any remaining points as not to be acquired"
>Set line complete</v-list-item-title>
</v-list-item>
<v-list-item @click="addToPlan" v-if="!contextMenuItem.ntba && !isPlanned(contextMenuItem)">
<v-list-item-title>Add to plan</v-list-item-title>
@@ -120,6 +132,10 @@
</dougal-line-status>
</template>
<template v-slot:item.tba="{item, value}">
<span :class="!value && (item.na ? 'warning--text' : 'success--text')">{{ value }}</span>
</template>
<template v-slot:item.length="props">
<span>{{ Math.round(props.value) }} m</span>
</template>
@@ -139,8 +155,9 @@
@click:append-outer="edit = null"
>
</v-text-field>
<div v-else v-html="$options.filters.markdownInline(item.remarks)">
<v-btn v-if="edit === null"
<div v-else>
<span v-html="$options.filters.markdownInline(item.remarks)"></span>
<v-btn v-if="writeaccess && edit === null"
icon
small
title="Edit"
@@ -203,7 +220,17 @@ export default {
},
{
value: "num_points",
text: "Num. points",
text: "Points",
align: "end"
},
{
value: "na",
text: "Virgin",
align: "end"
},
{
value: "tba",
text: "Remaining",
align: "end"
},
{
@@ -246,7 +273,7 @@ export default {
},
computed: {
...mapGetters(['user', 'loading', 'serverEvent'])
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
},
watch: {
@@ -271,7 +298,7 @@ export default {
async serverEvent (event) {
if (event.payload.pid == this.$route.params.project) {
if (event.channel == "preplot_lines") {
if (event.channel == "preplot_lines" || event.channel == "preplot_points") {
if (!this.loading && !this.queuedReload) {
// Do not force a non-cached response if refreshing as a result
// of an event notification. We will assume that the server has
@@ -352,6 +379,14 @@ export default {
value: !this.contextMenuItem.ntba
})
},
setComplete () {
this.saveItem({
line: this.contextMenuItem.line,
key: 'complete',
value: this.contextMenuItem.na && this.contextMenuItem.tba == this.contextMenuItem.na
})
},
async addToPlan () {
const payload = {

View File

@@ -33,7 +33,7 @@
</span>
</v-toolbar-title>
<dougal-event-edit-dialog
<dougal-event-edit-dialog v-if="writeaccess"
v-model="eventDialog"
:allowed-labels="userLabels"
:preset-remarks="presetRemarks"
@@ -108,141 +108,146 @@
</template>
<template v-slot:item.remarks="{item}">
<v-edit-dialog v-if="item.items"
large
@save="rowEditorSave"
@cancel="rowEditorCancel"
@open="rowEditorOpen(item)"
@close="rowEditorClose"
> <div v-html="$options.filters.markdownInline(item.items.map(i => i.remarks).join('<br/>'))"></div>
<template v-slot:input>
<h3>{{
editedRow.sequence
? `${editedRow.sequence} @ ${editedRow.point}`
: editedRow.tstamp
? editedRow.tstamp.replace(/(.{10})T(.{8}).{4}Z$/, "$1 $2")
: editedRow.key
}}</h3><hr/>
<template v-if="writeaccess">
<v-edit-dialog v-if="item.items"
large
@save="rowEditorSave"
@cancel="rowEditorCancel"
@open="rowEditorOpen(item)"
@close="rowEditorClose"
> <div v-html="$options.filters.markdownInline(item.items.map(i => i.remarks).join('<br/>'))"></div>
<template v-slot:input>
<h3>{{
editedRow.sequence
? `${editedRow.sequence} @ ${editedRow.point}`
: editedRow.tstamp
? editedRow.tstamp.replace(/(.{10})T(.{8}).{4}Z$/, "$1 $2")
: editedRow.key
}}</h3><hr/>
<dougal-context-menu
:value="remarksMenu"
@input="addPresetRemark"
:items="presetRemarks"
absolute
></dougal-context-menu>
<dougal-context-menu
:value="remarksMenu"
@input="addPresetRemark"
:items="presetRemarks"
absolute
></dougal-context-menu>
<template v-for="editedItem in editedRow.items">
<template v-for="editedItem in editedRow.items">
<v-text-field
v-model="editedItem.remarks"
label="Edit"
single-line
hide-details="auto"
>
<template v-slot:prepend>
<v-icon v-show="!editedItem.remarks && presetRemarks"
title="Select predefined comments"
color="primary"
@click="(e) => {remarksMenuItem = editedItem; remarksMenu = e}"
>
mdi-dots-vertical
</v-icon>
</template>
<template v-slot:append v-if="editedItem.remarks || editedItem.labels.filter(l => labels[l].model.user).length">
<v-hover v-slot:default="{hover}">
<v-icon
title="Remove comment"
:color="hover ? 'error' : 'error lighten-4'"
@click="removeEvent(editedItem, editedRow)"
>mdi-minus-circle</v-icon>
</v-hover>
</template>
</v-text-field>
<v-container>
<v-row no-gutters>
<v-col class="flex-grow-0">
<!-- Add a new label control -->
<v-edit-dialog
large
@save="addLabel(editedItem)"
@cancel="selectedLabels=[]"
>
<v-icon
small
title="Add label"
>mdi-tag-plus</v-icon>
<template v-slot:input>
<v-autocomplete
:items="availableLabels(editedItem.labels)"
v-model="selectedLabels"
label="Add label"
chips
deletable-chips
multiple
autofocus
@keydown.stop="(e) => {if (e.key == 'Enter') debug(e)}"
@input="labelSearch = null;"
:search-input.sync="labelSearch"
>
<template v-slot:selection="data">
<v-chip
v-bind="data.attrs"
:input-value="data.selected"
small
@click="data.select"
:color="labels[data.item].view.colour"
:title="labels[data.item].view.description"
>{{data.item}}</v-chip>
</template>
</v-autocomplete>
</template>
</v-edit-dialog>
</v-col>
<v-col class="flex-grow-0">
<v-chip-group>
<v-chip v-for="label in editedItem.labels" :key="label"
small
:close="labels[label].model.user"
:color="labels[label].view.colour"
:title="labels[label].view.description"
@click:close="removeLabel(label, editedItem)"
>{{label}}</v-chip>
</v-chip-group>
</v-col>
</v-row>
</v-container>
</template>
<v-icon v-if="editedRow.items.length == 0 || editedRow.items[editedRow.items.length-1].remarks"
color="primary"
title="Add comment"
class="mb-2"
@click="addEvent"
>mdi-plus-circle</v-icon>
</template>
</v-edit-dialog>
<v-edit-dialog v-else
@save="rowEditorSave"
@cancel="rowEditorCancel"
@open="rowEditorOpen"
@close="rowEditorClose"
>
<template v-slot:input>
<v-text-field
v-model="editedItem.remarks"
v-model="props.item.remarks[0]"
label="Edit"
single-line
hide-details="auto"
>
<template v-slot:prepend>
<v-icon v-show="!editedItem.remarks && presetRemarks"
title="Select predefined comments"
color="primary"
@click="(e) => {remarksMenuItem = editedItem; remarksMenu = e}"
>
mdi-dots-vertical
</v-icon>
</template>
<template v-slot:append v-if="editedItem.remarks || editedItem.labels.filter(l => labels[l].model.user).length">
<v-hover v-slot:default="{hover}">
<v-icon
title="Remove comment"
:color="hover ? 'error' : 'error lighten-4'"
@click="removeEvent(editedItem, editedRow)"
>mdi-minus-circle</v-icon>
</v-hover>
</template>
</v-text-field>
<v-container>
<v-row no-gutters>
<v-col class="flex-grow-0">
<!-- Add a new label control -->
<v-edit-dialog
large
@save="addLabel(editedItem)"
@cancel="selectedLabels=[]"
>
<v-icon
small
title="Add label"
>mdi-tag-plus</v-icon>
<template v-slot:input>
<v-autocomplete
:items="availableLabels(editedItem.labels)"
v-model="selectedLabels"
label="Add label"
chips
deletable-chips
multiple
autofocus
@keydown.stop="(e) => {if (e.key == 'Enter') debug(e)}"
@input="labelSearch = null;"
:search-input.sync="labelSearch"
>
<template v-slot:selection="data">
<v-chip
v-bind="data.attrs"
:input-value="data.selected"
small
@click="data.select"
:color="labels[data.item].view.colour"
:title="labels[data.item].view.description"
>{{data.item}}</v-chip>
</template>
</v-autocomplete>
</template>
</v-edit-dialog>
</v-col>
<v-col class="flex-grow-0">
<v-chip-group>
<v-chip v-for="label in editedItem.labels" :key="label"
small
:close="labels[label].model.user"
:color="labels[label].view.colour"
:title="labels[label].view.description"
@click:close="removeLabel(label, editedItem)"
>{{label}}</v-chip>
</v-chip-group>
</v-col>
</v-row>
</v-container>
></v-text-field>
</template>
<v-icon v-if="editedRow.items.length == 0 || editedRow.items[editedRow.items.length-1].remarks"
color="primary"
title="Add comment"
class="mb-2"
@click="addEvent"
>mdi-plus-circle</v-icon>
</template>
</v-edit-dialog>
<v-edit-dialog v-else
@save="rowEditorSave"
@cancel="rowEditorCancel"
@open="rowEditorOpen"
@close="rowEditorClose"
>
<template v-slot:input>
<v-text-field
v-model="props.item.remarks[0]"
label="Edit"
single-line
></v-text-field>
</template>
</v-edit-dialog>
</v-edit-dialog>
</template>
<template v-else>
<div v-html="$options.filters.markdownInline(item.items.map(i => i.remarks).join('<br/>'))"></div>
</template>
</template>
@@ -262,15 +267,15 @@
<!-- Actions column (FIXME currently not used) -->
<template v-slot:item.actions="{ item }">
<div style="white-space:nowrap;">
<v-icon v-if="$root.user || true"
small
class="mr-2"
title="View on map"
@click="viewOnMap(item)"
disabled
>
mdi-map
</v-icon>
<a :href="viewOnMap(item)" v-if="viewOnMap(item)">
<v-icon v-if="$root.user || true"
small
class="mr-2"
title="View on map"
>
mdi-map
</v-icon>
</a>
</div>
</template>
@@ -400,7 +405,7 @@ export default {
}
},
...mapGetters(['user', 'loading', 'online', 'sequence', 'line', 'point', 'lineName', 'serverEvent']),
...mapGetters(['user', 'writeaccess', 'loading', 'online', 'sequence', 'line', 'point', 'lineName', 'serverEvent']),
...mapState({projectSchema: state => state.project.projectSchema})
},
@@ -527,14 +532,19 @@ export default {
},
async saveEvent (event) {
const callback = (err, res) => {
if (!err && res.ok) {
this.showSnack(["New event saved", "success"]);
this.queuedReload = true;
this.getEvents({cache: "reload"});
}
}
const url = `/project/${this.$route.params.project}/event`;
await this.api([url, {
method: "POST",
body: event
}]);
this.showSnack(["New event saved", "success"]);
this.queuedReload = true;
this.getEvents({cache: "reload"});
}, callback]);
},
rowEditorOpen (row) {
@@ -736,6 +746,15 @@ export default {
item.items.some( i => i.labels.some( l => l.toLowerCase().includes(s) ));
}
},
viewOnMap(row) {
if (row && row.items && row.items.length) {
if (row.items[0].geometry && row.items[0].geometry.type == "Point") {
const [ lon, lat ] = row.items[0].geometry.coordinates;
return `map#15/${lon.toFixed(6)}/${lat.toFixed(6)}`;
}
}
},
...mapActions(["api", "showSnack"])
},

View File

@@ -88,6 +88,30 @@ const layers = {
},
}),
"Saillines": L.geoJSON(null, {
pointToLayer (point, latlng) {
return L.circle(latlng, {
radius: 1,
color: "#3388ff",
stroke: false,
fillOpacity: 0.8
});
},
style (feature) {
return {
opacity: feature.properties.ntba ? 0.2 : 0.5,
color: "cyan"
}
},
onEachFeature (feature, layer) {
const p = feature.properties;
const popup = feature.geometry.type == "Point"
? `Preplot<br/>Point <b>${p.line} / ${p.point}</b>`
: `Preplot${p.ntba? " (NTBA)":""}<br/>Line <b>${p.line}</b>${p.remarks ? markdown(p.remarks) : ""}`;
layer.bindTooltip(popup, {sticky: true});
},
}),
"Plan": L.geoJSON(null, {
arrowheads: {
size: "8px",
@@ -290,7 +314,7 @@ function makeRealTimePopup(feature) {
Position as of ${p.tstamp}<br/><hr/>
${online}
<table>
<tr><td><b>Speed:</b></td><td>${p.speed ? p.speed*3.6/1.852 : "???"} kt</td></tr>
<tr><td><b>Speed:</b></td><td>${p.speed ? (p.speed*3.6/1.852).toFixed(1) : "???"} kt</td></tr>
<tr><td><b>CMG:</b></td><td>${p.cmg || "???"}°</td></tr>
<tr><td><b>Water depth:</b></td><td>${p.waterDepth || "???"} m</td></tr>
<tr><td><b>WGS84:</b></td><td>${wgs84}</td></tr>
@@ -317,6 +341,16 @@ export default {
: `/project/${this.$route.params.project}/gis/preplot/point?${query.toString()}`;
}
},
{
layer: layers["Saillines"],
url: (query = "") => {
const q = new URLSearchParams(query);
q.set("class", "V");
return map.getZoom() < 18
? `/project/${this.$route.params.project}/gis/preplot/line?${q.toString()}`
: `/project/${this.$route.params.project}/gis/preplot/point?${q.toString()}`;
}
},
{
layer: layers.Plan,
url: (query = "") => {
@@ -339,7 +373,8 @@ export default {
: `/project/${this.$route.params.project}/gis/final/point?${query.toString()}`;
}
}
]
],
hashMarker: null
};
},
@@ -380,6 +415,12 @@ export default {
} else if (event.channel == "event" && event.payload.schema == this.projectSchema) {
//console.log("EVENT", event);
}
},
$route (to, from) {
if (to.name == "map") {
this.setHashMarker();
}
}
},
@@ -587,6 +628,48 @@ export default {
map.on('layerremove', this.updateURL);
},
setHashMarker () {
const crosshairsMarkerIcon = L.divIcon({
iconSize: [20, 20],
iconAnchor: [10, 10],
className: 'svgmarker',
html: `
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16">
<path style="fill:inherit;fill-opacity:1;stroke:none"
d="M 7 3 L 7 4.03125 A 4.5 4.5 0 0 0 3.0332031 8 L 2 8 L 2 9 L 3.03125 9 A 4.5 4.5 0 0 0 7 12.966797 L 7 14 L 8 14 L 8 12.96875 A 4.5 4.5 0 0 0 11.966797 9 L 13 9 L 13 8 L 11.96875 8 A 4.5 4.5 0 0 0 8 4.0332031 L 8 3 L 7 3 z M 7 5.0390625 L 7 8 L 4.0410156 8 A 3.5 3.5 0 0 1 7 5.0390625 z M 8 5.0410156 A 3.5 3.5 0 0 1 10.960938 8 L 8 8 L 8 5.0410156 z M 4.0390625 9 L 7 9 L 7 11.958984 A 3.5 3.5 0 0 1 4.0390625 9 z M 8 9 L 10.958984 9 A 3.5 3.5 0 0 1 8 11.960938 L 8 9 z "
/>
</svg>
`
});
const updateMarker = (latlng) => {
if (this.hashMarker) {
if (latlng) {
this.hashMarker.setLatLng(latlng);
} else {
map.removeLayer(this.hashMarker);
this.hashMarker = null;
}
} else if (latlng) {
this.hashMarker = L.marker(latlng, {icon: crosshairsMarkerIcon, interactive: false});
this.hashMarker.addTo(map).getElement().style.fill = "fuchsia";
}
}
const parts = document.location.hash.substring(1).split(":")[0].split("/").map(p => decodeURIComponent(p));
if (parts.length == 3) {
setTimeout(() => map.setView(parts.slice(1).reverse(), parts[0]), 500);
updateMarker(parts.slice(1).reverse());
} else if (parts.length == 2) {
parts.reverse();
setTimeout(() => map.panTo(parts), 500);
updateMarker(parts);
} else {
updateMarker();
}
},
...mapActions(["api"])
@@ -745,6 +828,9 @@ export default {
});
(new LoadingControl({position: "bottomright"})).addTo(map);
// Decode a position if one given in the hash
this.setHashMarker();
}
}
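The `setHashMarker()` method added above parses a URL fragment of the form `#zoom/lon/lat` (as produced by `viewOnMap()`) or `#lon/lat`. A hedged standalone sketch of that parsing, noting that Leaflet wants `[lat, lng]`, hence the `reverse()`:

```javascript
// Hypothetical parser mirroring setHashMarker(): accepts "#z/lon/lat"
// or "#lon/lat"; anything else yields null (no marker).
function parseHash (hash) {
  const parts = hash.substring(1).split(":")[0].split("/")
    .map(p => decodeURIComponent(p));
  if (parts.length === 3) {
    // Hash stores lon/lat; Leaflet expects [lat, lng].
    return { zoom: Number(parts[0]), latlng: parts.slice(1).reverse().map(Number) };
  } else if (parts.length === 2) {
    return { latlng: parts.reverse().map(Number) };
  }
  return null;
}
```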

View File

@@ -16,7 +16,7 @@
</v-card-title>
<v-card-text>
<v-menu
<v-menu v-if="writeaccess"
v-model="contextMenuShow"
:position-x="contextMenuX"
:position-y="contextMenuY"
@@ -44,8 +44,12 @@
@contextmenu:row="contextMenu"
>
<template v-slot:item.srss="{item}">
<v-icon small :title="srssInfo(item)">{{srssIcon(item)}}</v-icon>
</template>
<template v-slot:item.sequence="{item, value}">
<v-edit-dialog
<v-edit-dialog v-if="writeaccess"
large
@open="editItem(item, 'sequence')"
@save="edit = null"
@@ -66,10 +70,11 @@
></v-checkbox>
</template>
</v-edit-dialog>
<span v-else>{{ value }}</span>
</template>
<template v-slot:item.name="{item, value}">
<v-edit-dialog
<v-edit-dialog v-if="writeaccess"
large
@open="editItem(item, 'name')"
@save="edit = null"
@@ -84,10 +89,11 @@
</v-text-field>
</template>
</v-edit-dialog>
<span v-else>{{ value }}</span>
</template>
<template v-slot:item.fsp="{item, value}">
<v-edit-dialog
<v-edit-dialog v-if="writeaccess"
large
@open="editItem(item, 'fsp')"
@save="edit = null"
@@ -103,10 +109,11 @@
</v-text-field>
</template>
</v-edit-dialog>
<span v-else>{{ value }}</span>
</template>
<template v-slot:item.lsp="{item, value}">
<v-edit-dialog
<v-edit-dialog v-if="writeaccess"
large
@open="editItem(item, 'lsp')"
@save="edit = null"
@@ -122,12 +129,13 @@
</v-text-field>
</template>
</v-edit-dialog>
<span v-else>{{ value }}</span>
</template>
<template v-slot:item.ts0="{item, value}">
<v-edit-dialog
<v-edit-dialog v-if="writeaccess"
large
@open="editItem(item, 'ts0', item.ts1.toISOString())"
@open="editItem(item, 'ts0', item.ts0.toISOString())"
@save="edit = null"
@cancel="edit.value = item.ts0; edit = null"
>
@@ -141,10 +149,11 @@
</v-text-field>
</template>
</v-edit-dialog>
<span v-else>{{ value.toISOString ? value.toISOString().slice(0, 16) : "" }}</span>
</template>
<template v-slot:item.ts1="{item, value}">
<v-edit-dialog
<v-edit-dialog v-if="writeaccess"
large
@open="editItem(item, 'ts1', item.ts1.toISOString())"
@save="edit = null"
@@ -160,6 +169,7 @@
</v-text-field>
</template>
</v-edit-dialog>
<span v-else>{{ value.toISOString ? value.toISOString().slice(0, 16) : "" }}</span>
</template>
<template v-slot:item.length="props">
@@ -171,7 +181,7 @@
</template>
<template v-slot:item.remarks="{item}">
<v-text-field v-if="edit && edit.sequence == item.sequence && edit.key == 'remarks'"
<v-text-field v-if="writeaccess && edit && edit.sequence == item.sequence && edit.key == 'remarks'"
type="text"
v-model="edit.value"
prepend-icon="mdi-restore"
@@ -181,8 +191,9 @@
@click:append-outer="edit = null"
>
</v-text-field>
<div v-else v-html="$options.filters.markdownInline(item.remarks)">
<v-btn v-if="edit === null"
<div v-else>
<span v-html="$options.filters.markdownInline(item.remarks)"></span>
<v-btn v-if="edit === null && writeaccess"
icon
small
title="Edit"
@@ -196,7 +207,7 @@
</template>
<template v-slot:item.speed="{item}">
<v-edit-dialog
<v-edit-dialog v-if="writeaccess"
large
@open="editItem(item, 'speed', knots(item).toFixed(1))"
@save="edit = null"
@@ -214,10 +225,11 @@
</v-text-field>
</template>
</v-edit-dialog>
<span v-else style="white-space:nowrap;">{{ knots(item).toFixed(1) }} kt</span>
</template>
<template v-slot:item.lag="{item}">
<v-edit-dialog
<v-edit-dialog v-if="writeaccess"
large
@open="editItem(item, 'lagAfter', Math.round(lagAfter(item)/(60*1000)))"
@save="edit = null"
@@ -234,6 +246,7 @@
</v-text-field>
</template>
</v-edit-dialog>
<span v-else>{{ Math.round(lagAfter(item) / (60*1000)) }} min</span>
</template>
</v-data-table>
@@ -247,10 +260,11 @@
</style>
<script>
import suncalc from 'suncalc';
import { mapActions, mapGetters } from 'vuex';
export default {
name: "LineList",
name: "Plan",
components: {
},
@@ -262,6 +276,10 @@ export default {
value: "sequence",
text: "Sequence"
},
{
value: "srss",
text: "SR/SS"
},
{
value: "name",
text: "Name"
@@ -337,7 +355,7 @@ export default {
},
computed: {
...mapGetters(['user', 'loading', 'serverEvent'])
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
},
watch: {
@@ -351,35 +369,37 @@ export default {
if (oldVal.value === null) oldVal.value = "";
if (item) {
if (oldVal.key == "lagAfter") {
// We need to shift the times for every subsequent sequence
const delta = oldVal.value*60*1000 - this.lagAfter(item);
await this.shiftTimesAfter(item, delta);
} else if (oldVal.key == "speed") {
const v = oldVal.value*(1.852/3.6)/1000; // m/s
const ts1 = new Date(item.ts0.valueOf() + item.length / v);
const delta = ts1 - item.ts1;
await this.shiftTimesAfter(item, delta);
await this.saveItem({sequence: item.sequence, key: 'ts1', value: ts1});
} else if (oldVal.key == "sequence") {
if (this.shiftAll) {
await this.shiftSequences(oldVal.value-item.sequence);
} else {
await this.shiftSequence(item, oldVal.value);
if (item[oldVal.key] != oldVal.value) {
if (oldVal.key == "lagAfter") {
// Convert from minutes to seconds
oldVal.value *= 60;
} else if (oldVal.key == "speed") {
// Convert knots to metres per second
oldVal.value = oldVal.value*(1.852/3.6);
}
} else if (item[oldVal.key] != oldVal.value) {
if (await this.saveItem(oldVal)) {
item[oldVal.key] = oldVal.value;
} else {
this.edit = oldVal;
}
}
}
}
},
async serverEvent (event) {
if (event.channel == "planned_lines" && event.payload.pid == this.$route.params.project) {
// Ignore non-ops
/*
if (event.payload.old === null && event.payload.new === null) {
return;
}
*/
if (!this.loading && !this.queuedReload) {
// Do not force a non-cached response if refreshing as a result
// of an event notification. We will assume that the server has
@@ -415,6 +435,120 @@ export default {
},
methods: {
suntimes (line) {
const oneday = 86400000;
function isDay (srss, ts, lat, lng) {
if (isNaN(srss.sunriseEnd) || isNaN(srss.sunsetStart)) {
// Between March and September
ts = new Date(ts);
if (ts.getMonth() >= 2 && ts.getMonth() <= 8) {
// Polar day in the Northern hemisphere, night in the South
return lat > 0;
} else {
return lat < 0;
}
} else {
if (srss.sunriseEnd < ts) {
if (ts < srss.sunsetStart) {
return true;
} else {
return suncalc.getTimes(new Date(ts.valueOf() + oneday), lat, lng).sunriseEnd < ts;
}
} else {
return ts < suncalc.getTimes(new Date(ts.valueOf() - oneday), lat, lng).sunsetStart;
}
}
}
let {ts0, ts1} = line;
const [ lng0, lat0 ] = line.geometry.coordinates[0];
const [ lng1, lat1 ] = line.geometry.coordinates[1];
if (ts1-ts0 > oneday) {
console.warn("Cannot provide reliable sunrise / sunset times for lines over 24 hr in this version");
//return null;
}
const srss0 = suncalc.getTimes(ts0, lat0, lng0);
const srss1 = suncalc.getTimes(ts1, lat1, lng1);
srss0.prevDay = suncalc.getTimes(new Date(ts0.valueOf()-oneday), lat0, lng0);
srss1.nextDay = suncalc.getTimes(new Date(ts1.valueOf()+oneday), lat1, lng1);
srss0.isDay = isDay(srss0, ts0, lat0, lng0);
srss1.isDay = isDay(srss1, ts1, lat1, lng1);
return {
ts0: srss0,
ts1: srss1
};
},
srssIcon (line) {
const srss = this.suntimes(line);
const moon = suncalc.getMoonIllumination(line.ts0);
return srss.ts0.isDay && srss.ts1.isDay
? 'mdi-weather-sunny'
: !srss.ts0.isDay && !srss.ts1.isDay
? moon.phase < 0.05
? 'mdi-moon-new'
: moon.phase < 0.25
? 'mdi-moon-waxing-crescent'
: moon.phase < 0.45
? 'mdi-moon-waxing-gibbous'
: moon.phase < 0.55
? 'mdi-moon-full'
: moon.phase < 0.75
? 'mdi-moon-waning-gibbous'
: 'mdi-moon-waning-crescent'
: 'mdi-theme-light-dark';
},
srssMoonPhase (line) {
const ts = new Date((Number(line.ts0)+Number(line.ts1))/2);
const moon = suncalc.getMoonIllumination(ts);
return moon.phase < 0.05
? 'New moon'
: moon.phase < 0.25
? 'Waxing crescent moon'
: moon.phase < 0.45
? 'Waxing gibbous moon'
: moon.phase < 0.55
? 'Full moon'
: moon.phase < 0.75
? 'Waning gibbous moon'
: 'Waning crescent moon';
},
srssInfo (line) {
const srss = this.suntimes(line);
const text = [];
try {
text.push(`Sunset at\t${srss.ts0.prevDay.sunset.toISOString().substr(0, 16)}Z (FSP)`);
text.push(`Sunrise at\t${srss.ts0.sunrise.toISOString().substr(0, 16)}Z (FSP)`);
text.push(`Sunset at\t${srss.ts0.sunset.toISOString().substr(0, 16)}Z (FSP)`);
if (line.ts0.getUTCDate() != line.ts1.getUTCDate()) {
text.push(`Sunrise at\t${srss.ts1.sunrise.toISOString().substr(0, 16)}Z (LSP)`);
text.push(`Sunset at\t${srss.ts1.sunset.toISOString().substr(0, 16)}Z (LSP)`);
}
text.push(`Sunrise at\t${srss.ts1.nextDay.sunrise.toISOString().substr(0, 16)}Z (LSP)`);
} catch (err) {
if (err instanceof RangeError) {
text.push(srss.ts0.isDay ? "Polar day" : "Polar night");
} else {
console.log("ERROR", err);
}
}
if (!srss.ts0.isDay || !srss.ts1.isDay) {
text.push(this.srssMoonPhase(line));
}
return text.join("\n");
},
lagAfter (item) {
const pos = this.items.indexOf(item)+1;
@@ -450,92 +584,7 @@ export default {
await this.api([url, init]);
await this.getPlannedLines();
},
async shiftSequences(delta) {
const lines = delta < 0
? this.items
: [...this.items].reverse(); // We go backwards so as to avoid conflicts.
for (const line of lines) {
const sequence = line.sequence+delta;
const url = `/project/${this.$route.params.project}/plan/${line.sequence}`;
const init = {
method: "PATCH",
headers: {"Content-Type": "application/json"},
body: {sequence, name: null} // Setting name to null causes it to be regenerated
}
await this.api([url, init]);
}
},
async shiftSequence (item, newSequence) {
if (item.sequence == newSequence) {
// Nothing to do
return;
}
const conflict = this.items.find(i => i.sequence == newSequence)
if (conflict) {
this.showSnack([`Sequence ${newSequence} already exists`, "error"]);
} else {
// Cannot do this check at the moment as we would have to load the list of sequences.
// TODO We will do this after refactoring.
/*
if (this.sequences.find(i => i.sequence == newSequence)) {
this.showSnack([`Sequence ${newSequence} conflicts with a line that's already been acquired`, "warning"]);
}
*/
const url = `/project/${this.$route.params.project}/plan/${item.sequence}`;
const init = {
method: "PATCH",
headers: {"Content-Type": "application/json"},
body: {
sequence: newSequence,
name: null
} // Setting name to null causes it to be regenerated
}
await this.api([url, init]);
}
},
async shiftTimesAfter(item, delta) {
const pos = this.items.indexOf(item)+1;
if (pos != 0) {
const modifiedLines = this.items.slice(pos);
if (modifiedLines.length) {
modifiedLines.reverse();
for (const line of modifiedLines) {
const ts0 = new Date(line.ts0.valueOf() + delta);
const ts1 = new Date(line.ts1.valueOf() + delta);
const url = `/project/${this.$route.params.project}/plan/${line.sequence}`;
const init = {
method: "PATCH",
headers: {"Content-Type": "application/json"},
body: {ts1, ts0}
}
await this.api([url, init]);
}
}
} else {
console.warn("Item", item, "not found");
}
},
editLagAfter (item) {
const pos = this.items.indexOf(item)+1;
if (pos != 0) {
if (pos < this.items.length) {
// Not last item
this.editedItems = this.items.slice(pos);
} else {
// Last item: there is nothing after it to edit.
}
} else {
console.warn("Item", item, "not found");
}
},
editItem (item, key, value) {
this.edit = {
sequence: item.sequence,

View File
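The shiftSequences method above reverses the iteration order when incrementing. The reason is easiest to see in isolation: sequence numbers behave like a unique key, so shifting upwards in ascending order would momentarily collide with the next row. A minimal standalone sketch (the `renumber` helper is hypothetical, not part of the commit):

```javascript
// Renumber values that carry a uniqueness constraint.
// Ascending order while incrementing would collide with the
// next value; reversing the iteration avoids that.
function renumber(sequences, delta) {
  const taken = new Set(sequences);
  const order = delta < 0 ? sequences : [...sequences].reverse();
  const result = new Map();
  for (const seq of order) {
    const next = seq + delta;
    if (taken.has(next)) {
      throw new Error(`Conflict: sequence ${next} already exists`);
    }
    taken.delete(seq);
    taken.add(next);
    result.set(seq, next);
  }
  return result;
}

// Shifting [1, 2, 3] up by one succeeds in reverse order:
// 3→4, then 2→3, then 1→2.
renumber([1, 2, 3], 1);
```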

@@ -83,13 +83,13 @@
small
:color="labels[label] && labels[label].view.colour"
:title="labels[label] && labels[label].view.description"
:close="label == 'QCAccepted'"
:close="writeaccess && label == 'QCAccepted'"
@click:close="unaccept(item)">
{{label}}
</v-chip>
<template v-if="!item.labels || !item.labels.includes('QCAccepted')">
<v-hover v-slot:default="{hover}">
<v-hover v-slot:default="{hover}" v-if="writeaccess">
<span v-if="item.children && item.children.length">
<v-btn
:class="{'text--disabled': !hover}"
@@ -226,7 +226,7 @@ export default {
return values;
},
...mapGetters(['loading'])
...mapGetters(['writeaccess', 'loading'])
},
watch: {

View File

@@ -21,18 +21,59 @@
<v-menu
v-model="contextMenuShow"
:close-on-content-click="false"
:position-x="contextMenuX"
:position-y="contextMenuY"
absolute
offset-y
>
<v-list dense v-if="contextMenuItem">
<v-list-item @click="addToPlan(false)">
<v-list-item @click="addToPlan(false); contextMenuShow=false" v-if="writeaccess">
<v-list-item-title>Reshoot</v-list-item-title>
</v-list-item>
<v-list-item @click="addToPlan(true)">
<v-list-item @click="addToPlan(true); contextMenuShow=false" v-if="writeaccess">
<v-list-item-title>Reshoot with overlap</v-list-item-title>
</v-list-item>
<v-list-group>
<template v-slot:activator>
<v-list-item-title>Download report</v-list-item-title>
</template>
<v-list-item
:href="`/api/project/${$route.params.project}/event/-/${contextMenuItem.sequence}?mime=application%2Fvnd.seis%2Bjson&download`"
title="Download as a Multiseis-compatible Seis+JSON file."
@click="contextMenuShow=false"
>
<v-list-item-title>Seis+JSON</v-list-item-title>
</v-list-item>
<v-list-item
:href="`/api/project/${$route.params.project}/event/-/${contextMenuItem.sequence}?mime=application%2Fgeo%2Bjson&download`"
title="Download as a QGIS-compatible GeoJSON file"
@click="contextMenuShow=false"
>
<v-list-item-title>GeoJSON</v-list-item-title>
</v-list-item>
<v-list-item
:href="`/api/project/${$route.params.project}/event/-/${contextMenuItem.sequence}?mime=application%2Fjson&download`"
title="Download as a generic JSON file"
@click="contextMenuShow=false"
>
<v-list-item-title>JSON</v-list-item-title>
</v-list-item>
<v-list-item
:href="`/api/project/${$route.params.project}/event/-/${contextMenuItem.sequence}?mime=text%2Fhtml&download`"
title="Download as an HTML formatted file"
@click="contextMenuShow=false"
>
<v-list-item-title>HTML</v-list-item-title>
</v-list-item>
<v-list-item
:href="`/api/project/${$route.params.project}/event/-/${contextMenuItem.sequence}?mime=application%2Fpdf&download`"
title="Download as a Portable Document File"
@click="contextMenuShow=false"
>
<v-list-item-title>PDF</v-list-item-title>
</v-list-item>
</v-list-group>
</v-list>
</v-menu>
@@ -61,26 +102,39 @@
<v-card outlined class="flex-grow-1">
<v-card-title>
Acquisition remarks
<v-btn v-if="edit && edit.sequence == item.sequence && edit.key == 'remarks'"
class="ml-3"
icon
small
title="Save edits"
:disabled="loading"
@click="edit = null"
>
<v-icon small>mdi-content-save-edit-outline</v-icon>
</v-btn>
<v-btn v-else-if="edit === null"
class="ml-3"
icon
small
title="Edit"
:disabled="loading"
@click="editItem(item, 'remarks')"
>
<v-icon small>mdi-square-edit-outline</v-icon>
</v-btn>
<template v-if="writeaccess">
<template v-if="edit && edit.sequence == item.sequence && edit.key == 'remarks'">
<v-btn
class="ml-3"
icon
small
title="Cancel edit"
:disabled="loading"
@click="edit.value = item.remarks; edit = null"
>
<v-icon small>mdi-close</v-icon>
</v-btn>
<v-btn v-if="edit.value != item.remarks"
icon
small
title="Save edits"
:disabled="loading"
@click="edit = null"
>
<v-icon small>mdi-content-save-edit-outline</v-icon>
</v-btn>
</template>
<v-btn v-else-if="edit === null"
class="ml-3"
icon
small
title="Edit"
:disabled="loading"
@click="editItem(item, 'remarks')"
>
<v-icon small>mdi-square-edit-outline</v-icon>
</v-btn>
</template>
</v-card-title>
<v-card-subtitle>
</v-card-subtitle>
@@ -100,26 +154,39 @@
<v-card outlined class="flex-grow-1" v-if="item.remarks_final !== null">
<v-card-title>
Processing remarks
<v-btn v-if="edit && edit.sequence == item.sequence && edit.key == 'remarks_final'"
class="ml-3"
icon
small
title="Save edits"
:disabled="loading"
@click="edit = null"
>
<v-icon small>mdi-content-save-edit-outline</v-icon>
</v-btn>
<v-btn v-else-if="edit === null"
class="ml-3"
icon
small
title="Edit"
:disabled="loading"
@click="editItem(item, 'remarks_final')"
>
<v-icon small>mdi-square-edit-outline</v-icon>
</v-btn>
<template v-if="writeaccess">
<template v-if="edit && edit.sequence == item.sequence && edit.key == 'remarks_final'">
<v-btn
class="ml-3"
icon
small
title="Cancel edit"
:disabled="loading"
@click="edit.value = item.remarks_final; edit = null"
>
<v-icon small>mdi-close</v-icon>
</v-btn>
<v-btn v-if="edit.value != item.remarks_final"
icon
small
title="Save edits"
:disabled="loading"
@click="edit = null"
>
<v-icon small>mdi-content-save-edit-outline</v-icon>
</v-btn>
</template>
<v-btn v-else-if="edit === null"
class="ml-3"
icon
small
title="Edit"
:disabled="loading"
@click="editItem(item, 'remarks_final')"
>
<v-icon small>mdi-square-edit-outline</v-icon>
</v-btn>
</template>
</v-card-title>
<v-card-subtitle>
</v-card-subtitle>
@@ -414,7 +481,7 @@ export default {
},
computed: {
...mapGetters(['user', 'loading', 'serverEvent'])
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
},
watch: {

View File

@@ -161,7 +161,7 @@ app.map({
// post: [ mw.info.post ],
},
'/project/:project/meta/': {
put: [ mw.meta.put ],
put: [ mw.auth.access.write, mw.meta.put ],
},
'/project/:project/meta/:path(*)': {
// Path examples:
@@ -186,6 +186,14 @@ app.map({
get: [ mw.gis.navdata.get ]
}
},
'/info/': {
':path(*)': {
get: [ mw.info.get ],
put: [ mw.auth.access.write, mw.info.put ],
post: [ mw.auth.access.write, mw.info.post ],
delete: [ mw.auth.access.write, mw.info.delete ]
}
},
'/rss/': {
get: [ mw.rss.get ]
}

View File

@@ -1,11 +1,10 @@
const { event } = require('../../../../lib/db');
const { transform, prepare } = require('../../../../lib/sse');
const geojson = async function (req, res, next) {
try {
const query = req.query;
query.sequence = req.params.sequence;
const events = await event.list(req.params.project, query);
const {events, sequences} = await prepare(req.params.project, query);
const response = {
type: "FeatureCollection",
features: events.filter(event => event.geometry).map(event => {
@@ -18,6 +17,17 @@ const geojson = async function (req, res, next) {
return feature;
})
};
if ("download" in query || "d" in query) {
const extension = "geojson";
// Get the sequence number(s) (more than one sequence can be selected)
const seqNums = query.sequence.split(";");
// If we've only been asked for a single sequence, get its line name
const lineName = (sequences.find(i => i.sequence == seqNums[0]) || {})?.meta?.lineName;
const filename = (seqNums.length == 1 && lineName)
? `${lineName}-NavLog.${extension}`
: `${req.params.project}-${query.sequence}.${extension}`;
res.set("Content-Disposition", `attachment; filename="${filename}"`);
}
res.status(200).send(response);
next();
} catch (err) {

View File
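The same download-filename block recurs in the geojson, html, json, pdf and seis handlers in this changeset. As a sketch only, it could be factored into a shared helper (the name `downloadFilename` is hypothetical, not part of the commit):

```javascript
// Hypothetical helper for the repeated download-filename logic.
function downloadFilename(query, sequences, project, extension) {
  // More than one sequence can be selected, separated by ";".
  const seqNums = String(query.sequence).split(";");
  // If only a single sequence was requested, prefer its line name.
  const lineName =
    (sequences.find(i => i.sequence == seqNums[0]) || {})?.meta?.lineName;
  return seqNums.length == 1 && lineName
    ? `${lineName}-NavLog.${extension}`
    : `${project}-${query.sequence}.${extension}`;
}

// downloadFilename({sequence: "12"},
//   [{sequence: 12, meta: {lineName: "L0012"}}], "demo", "pdf")
//   → "L0012-NavLog.pdf"
```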

@@ -1,18 +1,32 @@
const { event, sequence, configuration } = require('../../../../lib/db');
const { transform } = require('../../../../lib/sse');
const { configuration } = require('../../../../lib/db');
const { transform, prepare } = require('../../../../lib/sse');
const render = require('../../../../lib/render');
// FIXME Refactor when able
const defaultTemplatePath = require('path').resolve(__dirname, "../../../../../../../etc/default/templates/sequence.html.njk");
const html = async function (req, res, next) {
try {
const query = req.query;
query.sequence = req.params.sequence;
const events = await event.list(req.params.project, query);
const sequences = await sequence.list(req.params.project, query);
const {events, sequences} = await prepare(req.params.project, query);
const seis = transform(events, sequences, {projectId: req.params.project});
const templates = await configuration.get(req.params.project, "sse/templates");
const template = templates[0].template;
const template = (await configuration.get(req.params.project, "sse/templates/0/template")) || defaultTemplatePath;
// console.log("TEMPLATE", template);
const response = await render(seis, template);
if ("download" in query || "d" in query) {
const extension = "html";
// Get the sequence number(s) (more than one sequence can be selected)
const seqNums = query.sequence.split(";");
// If we've only been asked for a single sequence, get its line name
const lineName = (sequences.find(i => i.sequence == seqNums[0]) || {})?.meta?.lineName;
const filename = (seqNums.length == 1 && lineName)
? `${lineName}-NavLog.${extension}`
: `${req.params.project}-${query.sequence}.${extension}`;
res.set("Content-Disposition", `attachment; filename="${filename}"`);
}
res.status(200).send(response);
next();
} catch (err) {

View File

@@ -1,12 +1,22 @@
const { event } = require('../../../../lib/db');
const { transform, prepare } = require('../../../../lib/sse');
const json = async function (req, res, next) {
try {
const query = req.query;
query.sequence = req.params.sequence;
const response = await event.list(req.params.project, query);
res.status(200).send(response);
const {events, sequences} = await prepare(req.params.project, query);
if ("download" in query || "d" in query) {
const extension = "json";
// Get the sequence number(s) (more than one sequence can be selected)
const seqNums = query.sequence.split(";");
// If we've only been asked for a single sequence, get its line name
const lineName = (sequences.find(i => i.sequence == seqNums[0]) || {})?.meta?.lineName;
const filename = (seqNums.length == 1 && lineName)
? `${lineName}-NavLog.${extension}`
: `${req.params.project}-${query.sequence}.${extension}`;
res.set("Content-Disposition", `attachment; filename="${filename}"`);
}
res.status(200).send(events);
next();
} catch (err) {
next(err);

View File

@@ -1,11 +1,14 @@
const fs = require('fs/promises');
const Path = require('path');
const crypto = require('crypto');
const { event, sequence, configuration } = require('../../../../lib/db');
const { transform } = require('../../../../lib/sse');
const { configuration } = require('../../../../lib/db');
const { transform, prepare } = require('../../../../lib/sse');
const render = require('../../../../lib/render');
const { url2pdf } = require('../../../../lib/selenium');
// FIXME Refactor when able
const defaultTemplatePath = require('path').resolve(__dirname, "../../../../../../../etc/default/templates/sequence.html.njk");
function tmpname (tmpdir="/dev/shm") {
return Path.join(tmpdir, crypto.randomBytes(16).toString('hex')+".tmp");
}
@@ -15,17 +18,26 @@ const pdf = async function (req, res, next) {
try {
const query = req.query;
query.sequence = req.params.sequence;
const events = await event.list(req.params.project, query);
const sequences = await sequence.list(req.params.project, query);
const {events, sequences} = await prepare(req.params.project, query);
const seis = transform(events, sequences, {projectId: req.params.project});
const templates = await configuration.get(req.params.project, "sse/templates");
const template = templates[0].template;
const template = (await configuration.get(req.params.project, "sse/templates/0/template")) || defaultTemplatePath;
const html = await render(seis, template);
await fs.writeFile(fname, html);
const pdf = Buffer.from(await url2pdf("file://"+fname), "base64");
if ("download" in query || "d" in query) {
const extension = "pdf";
// Get the sequence number(s) (more than one sequence can be selected)
const seqNums = query.sequence.split(";");
// If we've only been asked for a single sequence, get its line name
const lineName = (sequences.find(i => i.sequence == seqNums[0]) || {})?.meta?.lineName;
const filename = (seqNums.length == 1 && lineName)
? `${lineName}-NavLog.${extension}`
: `${req.params.project}-${query.sequence}.${extension}`;
res.set("Content-Disposition", `attachment; filename="${filename}"`);
}
res.status(200).send(pdf);
next();
} catch (err) {

View File

@@ -1,13 +1,22 @@
const { event, sequence } = require('../../../../lib/db');
const { transform } = require('../../../../lib/sse');
const { transform, prepare } = require('../../../../lib/sse');
const seis = async function (req, res, next) {
try {
const query = req.query;
query.sequence = req.params.sequence;
const events = await event.list(req.params.project, query);
const sequences = await sequence.list(req.params.project, query);
const {events, sequences} = await prepare(req.params.project, query);
const response = transform(events, sequences, {projectId: req.params.project});
if ("download" in query || "d" in query) {
const extension = "json";
// Get the sequence number(s) (more than one sequence can be selected)
const seqNums = query.sequence.split(";");
// If we've only been asked for a single sequence, get its line name
const lineName = (sequences.find(i => i.sequence == seqNums[0]) || {})?.meta?.lineName;
const filename = (seqNums.length == 1 && lineName)
? `${lineName}-NavLog.${extension}`
: `${req.params.project}-${query.sequence}.${extension}`;
res.set("Content-Disposition", `attachment; filename="${filename}"`);
}
res.status(200).send(response);
next();
} catch (err) {

View File

@@ -0,0 +1,14 @@
const { info } = require('../../../lib/db');
module.exports = async function (req, res, next) {
try {
await info.delete(req.params.project, req.params.path);
res.status(204).send();
next();
} catch (err) {
next(err);
}
};

View File

@@ -0,0 +1,16 @@
const { info } = require('../../../lib/db');
module.exports = async function (req, res, next) {
try {
const payload = req.body;
await info.post(req.params.project, req.params.path, payload);
res.status(201).send();
next();
} catch (err) {
next(err);
}
};

View File

@@ -0,0 +1,16 @@
const { info } = require('../../../lib/db');
module.exports = async function (req, res, next) {
try {
const payload = req.body;
await info.put(req.params.project, req.params.path, payload);
res.status(201).send();
next();
} catch (err) {
next(err);
}
};

View File

@@ -0,0 +1,146 @@
const { schema2pid } = require('../../lib/db/connection');
const { event } = require('../../lib/db');
class DetectSOLEOL {
/* Data may come much faster than we can process it, so we put it
* in a queue and process it at our own pace.
*
* The run() method fills the queue with the necessary data and then
* calls processQueue().
*
* The processQueue() method takes the first two elements in
* the queue and processes them unless they are already being
* taken care of by a previous processQueue() call; this will
* happen when data is coming in faster than it can be processed.
*
* If a processQueue() call is the first to see the two bottommost
* elements, it will process them and, when finished, it will set
* the `isPending` flag of the bottommost element to `false`, thus
* letting the next call know that it has work to do.
*
* If the queue was empty, run() will set the `isPending` flag of its
* first element to a falsy value, thus bootstrapping the process.
*/
static MAX_QUEUE_SIZE = 125000;
queue = [];
async processQueue () {
while (this.queue.length > 1) {
if (this.queue[0].isPending) {
setImmediate(() => this.processQueue());
return;
}
const prev = this.queue.shift();
const cur = this.queue[0];
const sequence = Number(cur._sequence);
try {
if (prev.lineName == cur.lineName && prev._sequence == cur._sequence &&
prev.lineStatus != "online" && cur.lineStatus == "online" && sequence) {
// console.log("TRANSITION TO ONLINE", prev, cur);
// Check if there are already FSP, FGSP events for this sequence
const projectId = await schema2pid(cur._schema);
const sequenceEvents = await event.list(projectId, {sequence});
const labels = ["FSP", "FGSP"].filter(l => !sequenceEvents.find(i => i.labels.includes(l)));
if (labels.includes("FSP")) {
// At this point labels contains either FSP only or FSP + FGSP,
// depending on whether a FGSP event has already been entered.
const remarks = `SEQ ${cur._sequence}, SOL ${cur.lineName}, BSP: ${(cur.speed*3.6/1.852).toFixed(1)} kt, Water depth: ${Number(cur.waterDepth).toFixed(0)} m.`;
const payload = {
type: "sequence",
sequence,
point: cur._point,
remarks,
labels
}
// console.log(projectId, payload);
await event.post(projectId, payload);
} else {
// A first shot point has already been entered in the log,
// so we have nothing to do here.
}
} else if (prev.lineStatus == "online" && cur.lineStatus != "online") {
// console.log("TRANSITION TO OFFLINE", prev, cur);
// Check if there are already LSP, LGSP events for this sequence
const projectId = await schema2pid(prev._schema);
const sequenceEvents = await event.list(projectId, {sequence});
const labels = ["LSP", "LGSP"].filter(l => !sequenceEvents.find(i => i.labels.includes(l)));
if (labels.includes("LSP")) {
// At this point labels contains either LSP only or LSP + LGSP,
// depending on whether a LGSP event has already been entered.
const remarks = `SEQ ${prev._sequence}, EOL ${prev.lineName}, BSP: ${(prev.speed*3.6/1.852).toFixed(1)} kt, Water depth: ${Number(prev.waterDepth).toFixed(0)} m.`;
const payload = {
type: "sequence",
sequence,
point: prev._point,
remarks,
labels
}
// console.log(projectId, payload);
await event.post(projectId, payload);
} else {
// A last shot point has already been entered in the log,
// so we have nothing to do here.
}
}
// Processing of this shot has already been completed.
// The queue can now move forward.
} catch (err) {
console.error("DetectSOLEOL error:", err);
} finally {
cur.isPending = false;
}
}
}
async run (data) {
if (!data || data.channel !== "realtime") {
return;
}
if (!(data.payload && data.payload.new && data.payload.new.meta)) {
return;
}
const meta = data.payload.new.meta;
if (this.queue.length < DetectSOLEOL.MAX_QUEUE_SIZE) {
this.queue.push({
isPending: this.queue.length,
_schema: meta._schema,
time: meta.time,
shot: meta.shot,
lineStatus: meta.lineStatus,
_sequence: meta._sequence,
_point: meta._point,
lineName: meta.lineName,
speed: meta.speed,
waterDepth: meta.waterDepth
});
} else {
// FIXME Change to alert
console.error("DetectSOLEOL queue full at", this.queue.length);
}
this.processQueue();
}
}
module.exports = DetectSOLEOL;

View File
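The queue bootstrap described in the DetectSOLEOL comment block relies on `isPending: this.queue.length` being falsy only for the first element of an empty queue. A minimal synchronous sketch of the same hand-over pattern, with the async event work replaced by a stand-in (hypothetical, for illustration only):

```javascript
// Elements are pushed with `isPending: this.queue.length`, so the
// first element of an empty queue gets a falsy flag and bootstraps
// processing; each later element stays pending until the call that
// consumed its predecessor clears the flag.
class PairwiseQueue {
  queue = [];
  processed = [];
  processQueue() {
    while (this.queue.length > 1) {
      if (this.queue[0].isPending) return; // a previous call owns this pair
      const prev = this.queue.shift();
      const cur = this.queue[0];
      this.processed.push([prev.value, cur.value]); // stand-in for real work
      cur.isPending = false; // hand the queue over to the next call
    }
  }
  push(value) {
    this.queue.push({ isPending: this.queue.length, value });
    this.processQueue();
  }
}
```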

@@ -0,0 +1,12 @@
const Handlers = [
require('./detect-soleol')
];
function init () {
return Handlers.map(Handler => new Handler());
}
module.exports = {
Handlers,
init
}

View File

@@ -1,56 +1,21 @@
const { schema2pid } = require('../lib/db/connection');
const { listen } = require('../ws/db');
const { event } = require('../lib/db');
const channels = require('../lib/db/channels');
const handlers = require('./handlers').init();
function start () {
let prevPos = null;
listen(["realtime"], function (data) {
if (!(data.payload && data.payload.new && data.payload.new.meta)) {
console.log("Wrong event", data);
return;
listen(channels, async function (data) {
for (const handler of handlers) {
// NOTE: We are intentionally passing the same instance
// of the data to every handler. This means that earlier
// handlers could, in principle, modify the data to be
// consumed by later ones, provided that they are
// synchronous (as otherwise, the completion order is
// undefined).
await handler.run(data);
}
const pos = data.payload.new.meta;
if (prevPos) {
if (pos.lineStatus == "online") {
if (prevPos.lineStatus != "online") {
// FIXME TODO Check if there are already FSP, FGSP events for this sequence
// Tag this as FSP/FGSP
const remarks = `SEQ ${pos._sequence}, SOL ${pos.lineName}, BSP: ${(pos.speed*3.6/1.852).toFixed(1)} kt, Water depth: ${Number(pos.waterDepth).toFixed(0)} m.`;
const payload = {
type: "sequence",
sequence: pos._sequence,
point: pos._point,
remarks,
labels: [ "FSP", "FGSP" ]
}
schema2pid(pos._schema).then(projectId => event.post(projectId, payload));
// console.log("post fsp", pos._schema);
}
} else {
if (prevPos.lineStatus == "online") {
// FIXME TODO Check if there are already LSP, LGSP events for this sequence
// Tag this as LSP/LGSP
const remarks = `SEQ ${prevPos._sequence}, EOL ${prevPos.lineName}, BSP: ${(prevPos.speed*3.6/1.852).toFixed(1)} kt, Water depth: ${Number(prevPos.waterDepth).toFixed(0)} m.`;
const payload = {
type: "sequence",
sequence: prevPos._sequence,
point: prevPos._point,
remarks,
labels: [ "LSP", "LGSP" ]
}
schema2pid(prevPos._schema).then(projectId => event.post(projectId, payload));
// console.log("post lsp", prevPos._schema);
}
}
}
prevPos = JSON.parse(JSON.stringify(pos));
});
console.log("Events manager started");
console.log("Events manager started.", handlers.length, "active handlers");
}
module.exports = { start }

View File
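The BSP values in the remarks above come from `speed*3.6/1.852`, which reads as a unit conversion, assuming `speed` arrives in m/s: multiply by 3.6 for km/h, divide by 1.852 (km per nautical mile) for knots. As a standalone check:

```javascript
// m/s → km/h (×3.6) → knots (÷1.852, since 1 knot = 1.852 km/h).
// Assumes the incoming speed field is in m/s, as the formula implies.
function toKnots(metresPerSecond) {
  return metresPerSecond * 3.6 / 1.852;
}

// A typical towing speed of ~2.3 m/s is about 4.5 kt.
toKnots(2.315).toFixed(1); // → "4.5"
```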

@@ -0,0 +1,13 @@
// This is the list of all channels for which the
// database issues notifications.
// NOTE: This needs to be kept up to date with
// database schema changes.
module.exports = [
"realtime", "event", "project",
"preplot_lines", "preplot_points",
"planned_lines",
"raw_lines", "raw_shots",
"final_lines", "final_shots", "info"
];

View File

@@ -18,7 +18,8 @@ async function get (projectId, path, opts = {}) {
: res.rows.map(r => r.data);
if (path) {
return path.split('/').reduce( (obj, idx) => obj[idx], config);
return path.split('/').filter(i => i !== "").reduce( (obj, idx) =>
typeof obj !== 'undefined' ? obj[idx] : undefined, config);
} else {
return config;
}

View File
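The hardened path reducer above drops empty segments and keeps traversing past missing keys without throwing. Extracted as a standalone function to show the behavior:

```javascript
// Safe nested lookup: empty segments (leading, trailing, doubled
// slashes) are filtered out, and once a segment is missing the
// reducer returns undefined instead of raising a TypeError.
function lookup(config, path) {
  return path.split('/').filter(i => i !== "").reduce(
    (obj, idx) => typeof obj !== 'undefined' ? obj[idx] : undefined,
    config);
}

// lookup({sse: {templates: [{template: "t.njk"}]}},
//        "sse/templates/0/template") → "t.njk"
// lookup({}, "sse/templates/0/template") → undefined (no throw)
```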

@@ -32,7 +32,11 @@ async function setSurvey (projectId, client) {
if (!client) {
client = await pool.connect();
}
await client.query("CALL set_survey($1);", [projectId]);
if (projectId) {
await client.query("CALL set_survey($1);", [projectId]);
} else {
await client.query("SET search_path TO public;");
}
return client;
}

View File

@@ -27,7 +27,7 @@ async function list (projectId, opts = {}) {
const limit = Math.abs(Number(opts.itemsPerPage)) || null;
const filter = opts.sequence
? opts.sequence.includes(";")
? String(opts.sequence).includes(";")
? [ "sequence = ANY ( $1 )", [ opts.sequence.split(";") ] ]
: [ "sequence = $1", [ opts.sequence ] ]
: opts.date0

View File
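The `String(opts.sequence)` coercion above matters because a single sequence can arrive as a number, on which `.includes` would throw. A sketch of the filter selection in isolation (hypothetical helper shape, not the commit's code):

```javascript
// opts.sequence may be a number (one sequence) or a string such as
// "3;5;7"; coercing with String() keeps .includes() from throwing.
function sequenceFilter(opts) {
  return opts.sequence
    ? String(opts.sequence).includes(";")
      ? ["sequence = ANY ( $1 )", [String(opts.sequence).split(";")]]
      : ["sequence = $1", [opts.sequence]]
    : null; // date-based filtering is handled elsewhere
}
```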

@@ -31,6 +31,7 @@ async function insertSequenceEventLabels(event, client) {
FROM unnest($2::text[]) l (name)
INNER JOIN labels USING (name)
WHERE (data->'model'->'user')::boolean IS true
ON CONFLICT ON CONSTRAINT events_seq_labels_pkey DO NOTHING;
`;
// console.log("insertSequenceEventLabels", text, event);

View File

@@ -51,7 +51,8 @@ async function updateSeqEventLabels (event, client) {
const text = `
INSERT INTO events_seq_labels (id, label)
SELECT $1, label FROM unnest($2::text[]) t (label);
SELECT $1, label FROM unnest($2::text[]) t (label)
ON CONFLICT ON CONSTRAINT events_seq_labels_pkey DO NOTHING;
`;
return client.query(text, [event.id, event.labels]);

View File

@@ -19,7 +19,7 @@ async function lines (projectId, options = {}) {
FROM (
SELECT ST_AsGeoJSON(t.*) geojson
FROM (
SELECT line, incr, remarks, ST_Transform(geometry, 4326) geometry
SELECT line, incr, remarks, ntba, ST_Transform(geometry, 4326) geometry
FROM preplot_lines
WHERE
class = $7
@@ -62,7 +62,7 @@ async function points (projectId, options = {}) {
FROM (
SELECT ST_AsGeoJSON(t.*) geojson
FROM (
SELECT line, point, class, ST_Transform(geometry, 4326) geometry
SELECT line, point, class, ntba, ST_Transform(geometry, 4326) geometry
FROM preplot_points
WHERE
class = $7

View File

@@ -0,0 +1,28 @@
const { setSurvey, transaction } = require('../connection');
async function del (projectId, path, opts = {}) {
const client = await setSurvey(projectId);
const [key, ...jsonpath] = (path||"").split("/").filter(i => i.length);
try {
const text = jsonpath.length
? `
UPDATE info
SET value = value #- $2
WHERE key = $1;
`
: `
DELETE FROM info
WHERE key = $1;
`;
const values = jsonpath.length ? [key, jsonpath] : [key];
await client.query(text, values);
} catch (err) {
console.error("ERROR", err);
throw err;
} finally {
client.release();
}
}
module.exports = del;

View File

@@ -2,7 +2,7 @@ const { setSurvey } = require('../connection');
async function get (projectId, path, opts = {}) {
const client = await setSurvey(projectId);
const [key, ...subkey] = path.split("/");
const [key, ...subkey] = path.split("/").filter(i => i.trim().length);
const text = `
SELECT value
@@ -17,7 +17,7 @@ async function get (projectId, path, opts = {}) {
if (subkey.length) {
const res = subkey.reduce( (obj, idx) => typeof obj != "undefined" ? obj[idx] : obj, value);
console.log(res);
//console.log(res);
return res;
} else {
return value;

View File

@@ -0,0 +1,41 @@
const { setSurvey, transaction } = require('../connection');
async function post (projectId, path, payload, opts = {}) {
const client = await setSurvey(projectId);
const [key, ...jsonpath] = (path||"").split("/").filter(i => i.length);
try {
const text = jsonpath.length
? `
INSERT INTO info (key, value)
VALUES ($2, jsonb_insert('${isNaN(Number(jsonpath[0])) ? "{}" : "[]"}'::jsonb, $3, $1))
ON CONFLICT (key) DO UPDATE
SET
key = $2,
value = jsonb_insert((SELECT value FROM info WHERE key = $2), $3, $1, true)
RETURNING *;
`
: `
INSERT INTO info (key, value)
VALUES ($2, jsonb_insert('[]'::jsonb, '{0}', $1))
ON CONFLICT (key) DO UPDATE
SET
key = $2,
value = jsonb_insert((SELECT value FROM info WHERE key = $2), '{-1}'::text[], $1, true)
RETURNING *;
`;
const values = jsonpath.length ? [JSON.stringify(payload), key, jsonpath] : [JSON.stringify(payload), key];
await client.query(text, values);
} catch (err) {
console.error("ERROR", err);
if (err.code == 22023) {
throw {status: 400, message: "Cannot post to non-array"};
} else {
throw err;
}
} finally {
client.release();
}
}
module.exports = post;

View File

@@ -0,0 +1,37 @@
const { setSurvey, transaction } = require('../connection');
async function put (projectId, path, payload, opts = {}) {
const client = await setSurvey(projectId);
const [key, ...jsonpath] = (path||"").split("/").filter(i => i.length);
try {
const text = jsonpath.length
? `
INSERT INTO info (key, value)
VALUES ($2, jsonb_set('${isNaN(Number(jsonpath[0])) ? "{}" : "[]"}'::jsonb, $3, $1))
ON CONFLICT (key) DO UPDATE
SET
key = $2,
value = jsonb_set((SELECT value FROM info WHERE key = $2), $3, $1)
RETURNING *;
`
: `
INSERT INTO info (key, value)
VALUES ($2, $1)
ON CONFLICT (key) DO UPDATE
SET
key = $2,
value = $1
RETURNING *;
`;
const values = jsonpath.length ? [JSON.stringify(payload), key, jsonpath] : [JSON.stringify(payload), key];
await client.query(text, values);
} catch (err) {
console.error("ERROR", err);
throw err;
} finally {
client.release();
}
}
module.exports = put;

View File

@@ -10,44 +10,23 @@ async function list (projectId, opts = {}) {
const limit = Math.abs(Number(opts.itemsPerPage)) || null;
const text = `
WITH summary AS (
SELECT DISTINCT
line,
CASE
WHEN pl.incr THEN
first_value(point) OVER w
ELSE
last_value(point) OVER w
END fsp,
CASE
WHEN pl.incr THEN
last_value(point) OVER w
ELSE
first_value(point) OVER w
END lsp,
count(point) OVER w num_points,
-- ST_MakeLine(first_value(geometry) OVER w, last_value(geometry) over w) geometry,
ST_Distance(first_value(pp.geometry) OVER w, last_value(pp.geometry) over w) length,
CASE
WHEN pl.incr THEN
ST_Azimuth(first_value(pp.geometry) OVER w, last_value(pp.geometry) over w)*180/pi()
ELSE
ST_Azimuth(last_value(pp.geometry) OVER w, first_value(pp.geometry) over w)*180/pi()
END azimuth
FROM preplot_points pp
INNER JOIN preplot_lines pl USING (line)
WHERE pp.class = 'V'
WINDOW w AS (
PARTITION BY line
ORDER BY point ASC
ROWS BETWEEN
UNBOUNDED PRECEDING
AND UNBOUNDED FOLLOWING
)
WITH counts AS (
SELECT pls.*, COALESCE(ppc.virgin, 0) na, COALESCE(ppc.tba, 0) tba
FROM preplot_lines_summary pls
LEFT JOIN (
SELECT line, COUNT(*) virgin, COUNT(NULLIF(ntba,true)) tba
FROM preplot_points_count pc
INNER JOIN preplot_points pp
USING (line, point)
WHERE pc.count = 0
GROUP BY line
) ppc
USING (line)
)
SELECT s.*, incr, remarks, pl.ntba, pl.meta
FROM summary s
SELECT s.*, pl.ntba, c.na, c.tba, pl.meta
FROM preplot_lines_summary s
INNER JOIN preplot_lines pl ON pl.class = 'V' AND s.line = pl.line
LEFT JOIN counts c ON s.line = c.line
ORDER BY ${sortKey} ${sortDir}
OFFSET $1
LIMIT $2;

View File

@@ -7,6 +7,9 @@ async function patch (projectId, line, payload, opts = {}) {
"remarks": "UPDATE preplot_lines SET remarks = $2 WHERE line = $1 AND class ='V';",
"meta": "UPDATE preplot_lines SET meta = $2 WHERE line = $1 AND class ='V';",
"ntba": "UPDATE preplot_lines SET ntba = $2 WHERE line = $1 AND class ='V';",
"complete": "UPDATE preplot_points pp SET ntba = $2 FROM preplot_points_count ppc WHERE pp.line = ppc.line AND pp.point = ppc.point AND pp.line = $1 AND ($2 = false OR ppc.count = 0);"
// NOTE on the "complete" query: if complete is true it sets *only* virgin points to NTBA=true,
// but if complete is false it sets all points on the line to NTBA=false.
};
try {

View File
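The NOTE on the "complete" query describes asymmetric behavior: true marks only virgin points (count 0) as NTBA, false clears NTBA everywhere. Restated as a plain JavaScript predicate (a hypothetical paraphrase of the SQL, not code from the commit):

```javascript
// complete=true: only virgin points (count == 0) become NTBA,
// others keep their flag. complete=false: every point is cleared.
function applyComplete(points, complete) {
  return points.map(p => ({
    ...p,
    ntba: complete === false ? false : (p.count === 0 ? true : p.ntba)
  }));
}
```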

@@ -2,6 +2,8 @@
const { setSurvey, transaction, pool } = require('../connection');
let last_tstamp = 0;
async function getAllProjectConfigs () {
const client = await pool.connect();
@@ -64,6 +66,50 @@ async function getNearestPreplot (candidates) {
return res.rows[0] && res.rows[0].schema;
}
async function getNearestOfflinePreplot (candidates) {
const queries = candidates.map( c=> {
let text, values;
if ("latitude" in candidates[0] && "longitude" in candidates[0]) {
text = `
SELECT
'${c._schema}' AS _schema,
ST_Distance(ST_Transform(ST_SetSRID(ST_MakePoint($1, $2), 4326), ST_SRID(geometry)), geometry) AS distance
FROM ${c._schema}.preplot_points
ORDER BY distance ASC
LIMIT 1;
`;
values = [ candidates[0].longitude, candidates[0].latitude ];
} else if ("easting" in candidates[0] && "northing" in candidates[0]) {
text = `
SELECT
'${c._schema}' AS _schema,
ST_Distance(ST_SetSRID(ST_MakePoint($1, $2), ST_SRID(geometry)), geometry) AS distance
FROM ${c._schema}.preplot_points
ORDER BY distance ASC
LIMIT 1;
`;
values = [ candidates[0].easting, candidates[0].northing ];
} else {
// Missing a position, shouldn't happen at this point
return {};
}
return {text, values};
}).filter(i => i.text && i.values);
const client = await pool.connect();
const results = [];
for (const qry of queries) {
const res = await client.query(qry.text, qry.values);
if (res.rows[0] && res.rows[0]._schema) {
results.push(res.rows[0]);
}
}
client.release();
const best = results.sort((a, b) => a.distance - b.distance)[0];
if (!best) {
return undefined;
}
return candidates.find(c => c._schema == best._schema);
}
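The selection at the end of `getNearestOfflinePreplot` (sort candidates by distance, take the closest schema) can be isolated as a small helper. This is an illustrative sketch with assumed shapes, not code from the module:

```javascript
// Pick the schema whose preplot point is closest to the fix.
// Each result carries a _schema and its distance; smallest wins.
// Returns undefined when no query produced a row.
function pickNearestSchema(results) {
  const sorted = results.slice().sort((a, b) => a.distance - b.distance);
  return sorted.length ? sorted[0]._schema : undefined;
}
```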
async function saveOnline (dataset, opts = {}) {
const client = await pool.connect();
@@ -236,6 +282,23 @@ async function save (navData, opts = {}) {
navData.payload._point = candidates[0].point;
navData.payload._online = true;
}
} else {
// We are offline. We only assign _schema once every offline_survey_detect_interval milliseconds at most
if (opts.offline_survey_heuristics == "nearest_preplot") {
const now = Date.now();
const do_save = !opts.offline_survey_detect_interval ||
(now - last_tstamp) >= opts.offline_survey_detect_interval;
if (do_save) {
const configs = await getAllProjectConfigs();
const candidates = configs.map(c => Object.assign({}, navData, {_schema: c.schema}));
const bestCandidate = await getNearestOfflinePreplot(candidates);
if (bestCandidate) {
navData.payload._schema = bestCandidate._schema;
last_tstamp = now;
}
}
}
}
await saveOffline(navData, opts);
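The detect-interval throttle above reduces to a small predicate: run detection when no interval is configured, or when enough time has elapsed since the last detection. A sketch with illustrative names:

```javascript
// Returns true when schema detection should run: either no interval is
// configured (falsy), or at least intervalMs have elapsed since lastMs.
function shouldDetect(nowMs, lastMs, intervalMs) {
  return !intervalMs || (nowMs - lastMs) >= intervalMs;
}
```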

View File

@@ -3,42 +3,221 @@ const { getLineName } = require('./lib');
async function patch (projectId, sequence, payload, opts = {}) {
const client = await setSurvey(projectId);
sequence = Number(sequence);
/*
* Takes a Date object and returns the epoch
* in seconds
*/
function epoch (ts) {
return Number(ts)/1000;
}
/*
* Shift sequence ts0, ts1 by dt0, dt1 respectively
* for only one sequence
*/
async function shiftSequence (sequence, dt0, dt1) {
const text = `
UPDATE planned_lines
SET ts0 = ts0 + make_interval(secs => $2), ts1 = ts1 + make_interval(secs => $3)
WHERE sequence = $1
`;
return await client.query(text, [sequence, dt0, dt1]);
}
/*
* Shift sequence ts0, ts1 by dt0, dt1 respectively
* for all sequences >= sequence
*/
async function shiftSequences (sequence, dt0, dt1) {
const text = `
UPDATE planned_lines
SET ts0 = ts0 + make_interval(secs => $2), ts1 = ts1 + make_interval(secs => $3)
WHERE sequence >= $1
`;
return await client.query(text, [sequence, dt0, dt1]);
}
try {
await transaction.begin(client);
let deltatime;
const r0 = await client.query("SELECT * FROM planned_lines_summary WHERE sequence >= $1 ORDER BY sequence ASC LIMIT 2;", [sequence]);
const seq = (r0?.rows || [])[0];
if (!seq || seq?.sequence != sequence) {
throw {status: 400, message: `Sequence ${sequence} does not exist`};
}
const seq1 = r0.rows[1];
const speed = seq.length/(epoch(seq.ts1)-epoch(seq.ts0)); // m/s
if ("ts0" in payload || "ts1" in payload) {
/*
* Change in start or end times
*/
deltatime = "ts0" in payload
? (epoch(new Date(payload.ts0)) - epoch(seq.ts0))
: (epoch(new Date(payload.ts1)) - epoch(seq.ts1));
// Now shift all sequences >= this one by deltatime
await shiftSequences(sequence, deltatime, deltatime);
} else if ("speed" in payload) {
/*
* Change in acquisition speed (m/s)
*/
// Check that speed is sensible
if (payload.speed < 0.1) {
throw {status: 400, message: "Speed must be at least 0.1 m/s"};
}
deltatime = epoch(seq.ts0) + (seq.length/payload.speed) - epoch(seq.ts1);
// Fix seq.ts0, shift set.ts1 += deltatime, plus all sequences > this one
await shiftSequence(sequence, 0, deltatime);
await shiftSequences(sequence+1, deltatime, deltatime);
} else if ("fsp" in payload) {
/*
* Change of FSP
*/
// Keep ts1, adjust fsp and ts0 according to speed
// ts0' = (shot_distance * delta_shots / speed) + ts0
const sign = Math.sign(seq.lsp-seq.fsp);
const ts0 = (sign * (seq.length/seq.num_points) * (payload.fsp-seq.fsp) / speed) + epoch(seq.ts0);
const text = `
UPDATE planned_lines
SET fsp = $2, ts0 = $3
WHERE sequence = $1;
`;
await client.query(text, [sequence, payload.fsp, new Date(ts0*1000)]);
} else if ("lsp" in payload) {
/*
* Change of LSP
*/
// Keep ts0, adjust lsp and ts1 according to speed
// Calculate deltatime from ts1'-ts1
// Shift all sequences > this one by deltatime
// deltatime = (shot_distance * delta_shots / speed)
// ts1' = deltatime + ts1
const sign = Math.sign(seq.lsp-seq.fsp);
deltatime = (sign * (seq.length/seq.num_points) * (payload.lsp-seq.lsp) / speed);
const ts1 = deltatime + epoch(seq.ts1);
const text = `
UPDATE planned_lines
SET lsp = $2, ts1 = $3
WHERE sequence = $1;
`;
await client.query(text, [sequence, payload.lsp, new Date(ts1*1000)]);
await shiftSequences(sequence+1, deltatime, deltatime);
} else if ("lagAfter" in payload && seq1) {
/*
* Change of line change time
*/
// Check that the value is sensible
if (payload.lagAfter < 0) {
throw {status: 400, message: "Line change time cannot be negative"};
}
// Calculate deltatime from next sequence's ts0'-ts0
// Shift all sequences > this one by deltatime
const ts0 = epoch(seq.ts1) + payload.lagAfter; // lagAfter is in seconds
deltatime = ts0 - epoch(seq1.ts0);
await shiftSequences(sequence+1, deltatime, deltatime);
} else if ("sequence" in payload) {
/*
* Renumbering / reshuffling of sequences
*/
// NOTE: This does not enforce consecutive sequences, because sometimes
// there is a need for those (don't ask).
// Renumber or reorder sequences
const r1 = await client.query("SELECT sequence FROM planned_lines ORDER BY sequence;");
const sequences = (r1?.rows||[]).map(i => i.sequence);
const index = sequences.indexOf(payload.sequence);
if (index != -1) {
// Make space by shifting all sequence numbers >= payload.sequence by 1
const text = `
UPDATE planned_lines
SET sequence = sequence + 1
WHERE sequence >= $1;
`;
await client.query("SET CONSTRAINTS planned_lines_pkey DEFERRED;");
await client.query(text, [payload.sequence]);
// And now we need to rename all affected lines
const r2 = await client.query("SELECT * FROM planned_lines WHERE sequence > $1 ORDER BY sequence;", [payload.sequence]);
for (const row of r2.rows) {
const name = await getLineName(client, projectId, row);
await client.query("UPDATE planned_lines SET name = $2 WHERE sequence = $1", [row.sequence, name]);
}
}
// Now update just this sequence
const text = `
UPDATE planned_lines
SET sequence = $2
WHERE sequence = $1;
`;
await client.query(text, [sequence, payload.sequence]);
// And rename
const r3 = await client.query("SELECT * FROM planned_lines WHERE sequence = $1 ORDER BY sequence;", [payload.sequence]);
const name = await getLineName(client, projectId, r3.rows[0]);
await client.query("UPDATE planned_lines SET name = $2 WHERE sequence = $1", [payload.sequence, name]);
} else if (["name", "remarks", "meta"].some(i => i in payload)) {
/*
* Change in various other attributes that do not affect
* other sequences
*/
// NOTE Magic! If name is empty, we generate one.
// Can be used for going back to a default name after it's been
// changed manually.
if (payload.name === "") {
payload.name = await getLineName(client, projectId, r0.rows[0]);
}
// Change the relevant attribute
const text = `
UPDATE planned_lines
SET
name = COALESCE($2, name),
remarks = COALESCE($3, remarks),
meta = COALESCE($4, meta)
WHERE sequence = $1;
`;
await client.query(text, [sequence, payload.name, payload.remarks, payload.meta]);
} else {
throw { status: 400, message: "Bad request"};
}
await transaction.commit(client);
} catch (err) {
await transaction.rollback(client);
if (err.code == 23503 && (
err.constraint == "planned_lines_line_fsp_class_fkey" ||
err.constraint == "planned_lines_line_lsp_class_fkey")) {
throw {status: 400, message: "Attempt to shoot a non-existent shotpoint"};
}
throw err;
} finally {
client.release();
}
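The FSP/LSP arithmetic in the branches above boils down to shot interval times shot delta over speed. A worked sketch in epoch seconds, metres, and m/s (helper names are ours, not the module's):

```javascript
// Metres between consecutive shotpoints on the sequence.
function shotInterval(seq) {
  return seq.length / seq.num_points;
}

// New start epoch after moving FSP, keeping ts1 fixed: shift ts0 by the
// time needed to cover the skipped (or added) shotpoints at `speed` m/s.
function ts0ForNewFsp(seq, newFsp, speed) {
  const sign = Math.sign(seq.lsp - seq.fsp);
  return seq.ts0 + sign * shotInterval(seq) * (newFsp - seq.fsp) / speed;
}
```

For a 1000 m line of 100 points shot at 5 m/s, moving FSP forward by 10 points delays the start by 20 s.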

View File

@@ -13,17 +13,39 @@ function njkFind (ary, key, value) {
}
}
function njkCollect (entries, key, collectables) {
const out = [];
for (const entry of entries) {
if (out.find(i => i[key] == entry[key])) {
continue;
}
const others = entries.filter(item => item[key] == entry[key]);
const obj = Object.assign({}, entry);
for (const collectable of collectables) {
obj[collectable] = others.map(i => i[collectable]).filter(i => typeof i !== "undefined" && i !== "");
}
out.push(obj);
}
return out;
}
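The `collect` filter above groups template entries by a key and gathers the listed fields from every row sharing that key; restated here so the example runs self-contained:

```javascript
// Group entries by `key`; for each group, gather the values of every
// field in `collectables` into an array, dropping undefined/empty ones.
function njkCollect(entries, key, collectables) {
  const out = [];
  for (const entry of entries) {
    if (out.find(i => i[key] == entry[key])) continue;
    const others = entries.filter(item => item[key] == entry[key]);
    const obj = Object.assign({}, entry);
    for (const collectable of collectables) {
      obj[collectable] = others
        .map(i => i[collectable])
        .filter(i => typeof i !== "undefined" && i !== "");
    }
    out.push(obj);
  }
  return out;
}
```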
function njkUnique (entries) {
return entries.filter((element, index, array) => array.indexOf(element) === index);
}
function njkPadStart (str, len, chr) {
return String(str).padStart(len, chr);
}
function njkTimestamp (arg) {
if (arg) {
if (typeof arg.toISOString === "function") {
return arg.toISOString();
}
const ts = new Date(arg);
if (!isNaN(ts)) {
return ts.toISOString();
}
}
return arg;
}
@@ -40,6 +62,8 @@ async function render (data, template) {
const nenv = nunjucks.configure(Path.dirname(template), {autoescape: false, lstripBlocks: false, trimBlocks: false});
nenv.addFilter('find', njkFind);
nenv.addFilter('unique', njkUnique);
nenv.addFilter('collect', njkCollect);
nenv.addFilter('padStart', njkPadStart);
nenv.addFilter('timestamp', njkTimestamp);
nenv.addFilter('markdown', njkMarkdown);

View File

@@ -1,3 +1,4 @@
module.exports = {
transform: require('./transform'),
prepare: require('./prepare')
}

View File

@@ -0,0 +1,16 @@
const { event, sequence, info } = require('../db');
async function prepare (project, query) {
const events = await event.list(project, query);
const sequences = await sequence.list(project, query);
const equipment = await info.get(null, "equipment");
for (const sequence of sequences) {
const maxTstamp = sequence.ts1_final || sequence.ts1 || +Infinity;
if (equipment) {
sequence.equipment = equipment.filter(i => new Date(i.tstamp) <= maxTstamp);
}
}
return {events, sequences};
}
module.exports = prepare;
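The cut-off in `prepare` keeps only equipment entries whose timestamp is not after the sequence end. As a standalone sketch (field names as in the snippet above):

```javascript
// Keep equipment whose tstamp is on or before the sequence end;
// sequences with no end time keep everything (cut-off = Infinity).
function equipmentForSequence(equipment, seq) {
  const maxTstamp = seq.ts1_final || seq.ts1 || Infinity;
  return equipment.filter(i => new Date(i.tstamp) <= maxTstamp);
}
```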

View File

@@ -28,13 +28,17 @@ function transform (events, sequences, opts = {}) {
SequenceObject = {
SequenceNumber,
Entries: [],
DglSailline: sequence.line,
DglNumPoints: sequence.num_points,
DglNumMissing: sequence.missing_shots,
// NOTE: Distance & azimuth refer to raw data if the sequence
// status is 'raw' and to final data if status is 'final'. In
// the event of it being NTBP it depends on whether final data
// exists or not.
DglLength: sequence.length,
DglAzimuth: sequence.azimuth,
DglDuration: sequence.duration_final || sequence.duration,
DglEquipmentInfo: sequence.equipment
};
[sequence.remarks, sequence.remarks_final].filter(i => !!i).forEach(i => {
if (!SequenceObject.DglSequenceComments) {

View File

@@ -286,6 +286,12 @@ components:
num_points:
type: integer
description: Number of points in this line.
na:
type: integer
description: Number of points in this line which have not been acquired and processed at least once (virgin points).
tba:
type: integer
description: Number of virgin points in this line which do not have their `ntba` flag set.
length:
type: number
description: Length of the line in metres.
@@ -790,6 +796,9 @@ paths:
ntba:
type: boolean
description: Set the Not To Be Processed flag to `true` or `false`.
complete:
type: boolean
description: If `true`, set the Not To Be Processed flag to `true` on any points in the line which have not yet been successfully acquired and processed at least once (virgin points). If `false`, set the NTBA flag to `false` on *all* points in this line.
responses:
"204":

View File

@@ -5,6 +5,7 @@ var client;
const channels = {};
async function notify (data) {
if (data.channel in channels) {
data._received = new Date();
try {
@@ -14,7 +15,7 @@ async function notify (data) {
// Ignore the error
}
for (const listener of channels[data.channel]) {
await listener(JSON.parse(JSON.stringify(data)));
}
}
}
@@ -39,10 +40,10 @@ async function listen (addChannels, callback) {
return;
}
client.on('notification', notify);
console.log("Websocket client connected", Object.keys(channels));
client.on('error', (err) => console.error("Events client error: ", err));
client.on('end', () => {
console.warn("Websocket events client disconnected. Will attempt to reconnect in five seconds");
setImmediate(() => client = null);
setTimeout(reconnect, 5000);
});

View File

@@ -1,17 +1,10 @@
const ws = require('ws');
const URL = require('url');
const db = require('./db');
const channels = require('../lib/db/channels');
function start (server, pingInterval=30000) {
const wsServer = new ws.Server({ noServer: true });
wsServer.on('connection', socket => {
socket.alive = true;