Compare commits

..

56 Commits

Author SHA1 Message Date
D. Berge
4c2a2617a1 Adapt Project component to Vuex use for fetching data.
The Project component is now responsible for fetching and
updating the data used by most project tabs, with the
exception of ProjectSummary, QC, Graphs and Map. It is
also the only one listening for server events and reacting
to them.

Individual tabs are still responsible for sending data to
the server, at least for the time being.
2023-10-25 16:19:18 +02:00
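A minimal sketch of that arrangement (the component internals and the channel-to-action mapping are assumptions; the getter and action names are those introduced by the commits below):

import { mapGetters, mapActions } from 'vuex'

// Assumed sketch: the Project component reacting to server events by
// refreshing the Vuex modules added in this changeset. Channel names are
// taken from the diffs below; the exact mapping is illustrative only.
export default {
  computed: {
    ...mapGetters(['serverEvent'])
  },
  watch: {
    async serverEvent (event) {
      if (event.payload.pid != this.$route.params.project) return;
      switch (event.channel) {
        case 'event':
          await this.refreshEvents();   // partial update via /event/changes
          break;
        case 'preplot_lines':
        case 'preplot_points':
          await this.refreshLines();
          break;
        case 'raw_lines':
        case 'final_lines':
          await this.refreshSequences();
          break;
        case 'planned_lines':
          await this.refreshPlan();
          break;
      }
    }
  },
  methods: {
    ...mapActions(['refreshEvents', 'refreshLines', 'refreshSequences', 'refreshPlan'])
  }
}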
D. Berge
5021888d03 Adapt Log component to Vuex use for fetching data 2023-10-25 16:18:41 +02:00
D. Berge
bf633f7fdf Refactor Calendar component.
- adapts it to Vuex use for fetching data
- displays extra events in 4-day and day views
- allows classifying by event label in 4-day and day views
2023-10-25 16:16:01 +02:00
D. Berge
847f49ad7c Adapt SequenceList component to Vuex use for fetching data 2023-10-25 16:15:17 +02:00
D. Berge
171feb9dd2 Adapt Plan component to Vuex use for fetching data 2023-10-25 16:14:45 +02:00
D. Berge
503a0de12f Adapt LineList component to Vuex use for fetching data 2023-10-25 16:13:56 +02:00
D. Berge
cf89a43f64 Add project configuration to Vuex store 2023-10-25 16:11:24 +02:00
D. Berge
680e376ed1 Add Vuex sequence module 2023-10-25 16:11:24 +02:00
D. Berge
a26974670a Add Vuex plan module 2023-10-25 16:11:24 +02:00
D. Berge
16a6cb59dc Add Vuex line module 2023-10-25 16:11:24 +02:00
D. Berge
829e206831 Add Vuex label module 2023-10-25 09:59:04 +02:00
D. Berge
83244fcd1a Add Vuex event module 2023-10-25 09:51:28 +02:00
D. Berge
851369a0b4 Invalidate planner endpoint cache when setting remarks 2023-10-23 14:58:41 +02:00
D. Berge
5065d62443 Update planner endpoint documentation 2023-10-23 14:57:27 +02:00
D. Berge
2d1e1e9532 Modify return payload of planner endpoint.
Previous:

[
  { sequence: …},
  { sequence: …},
  …
]

Current:

{
  remarks: "…",
  sequences: [
    { sequence: …},
    { sequence: …},
    …
  ]
}
2023-10-23 14:53:32 +02:00
D. Berge
051049581a Merge branch '278-rewrite-events-queue' into 'devel'
Resolve "Rewrite events queue"

Closes #278

See merge request wgp/dougal/software!46
2023-10-17 10:28:21 +00:00
D. Berge
da5ae18b0b Merge branch '269-support-requesting-a-partial-update-from-the-events-log-endpoint' into devel 2023-10-17 12:27:31 +02:00
D. Berge
ac9353c101 Add database upgrade file 31. 2023-10-17 12:27:06 +02:00
D. Berge
c4c5c44bf1 Add comment 2023-10-17 12:20:19 +02:00
D. Berge
d3659ebf02 Merge branch '269-support-requesting-a-partial-update-from-the-events-log-endpoint' into 'devel'
Resolve "Support requesting a partial update from the events log endpoint"

Closes #269

See merge request wgp/dougal/software!47
2023-10-17 10:18:41 +00:00
D. Berge
6b5070e634 Add event changes API endpoint description 2023-10-17 12:15:41 +02:00
D. Berge
09ff96ceee Add events change API endpoint 2023-10-17 11:15:36 +02:00
D. Berge
f231acf109 Add events change middleware 2023-10-17 11:15:06 +02:00
D. Berge
e576e1662c Add library function returning event changes after given epoch 2023-10-17 11:13:58 +02:00
D. Berge
6a21ddd1cd Rewrite events listener and handlers.
The events listener now uses a proper self-consuming queue and
the event handlers have been rewritten accordingly.

Running init() on the handlers library instantiates the
handlers and returns two higher-order functions, prepare()
and despatch(). A call to despatch() is appended to the
queue for each new incoming event.

The handlers have access to a context object (ctx) which may be
used to persist data between calls and/or exchange data between
handlers. This is used notably to give the handlers access to
project configurations, which are themselves refreshed by a
project configuration change handler (DetectProjectConfigurationChange).
2023-10-14 20:53:42 +02:00
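A minimal sketch of the contract described above (loadProjectConfigurations() and all internals are assumptions, not taken from the actual diff):

// Assumed sketch of the handlers library contract described above.
function init () {
  const ctx = {};                 // persists data between calls and
                                  // lets handlers exchange data
  const handlers = [
    DetectProjectConfigurationChange,
    // … other handlers
  ];

  // prepare() primes the context, e.g. with project configurations.
  async function prepare () {
    ctx.configurations = await loadProjectConfigurations(); // hypothetical
  }

  // despatch() runs every handler against one incoming event.
  async function despatch (event) {
    for (const handler of handlers) {
      await handler(event, ctx);
    }
  }

  return { prepare, despatch };
}

// The listener appends a despatch() call to the queue per incoming event:
// queue.push(() => despatch(event));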
D. Berge
c1e35b2459 Cache project configuration details.
This avoids requesting the project configurations on every single
incoming message. A listener refreshes the data on configuration
changes.
2023-10-14 20:11:18 +02:00
D. Berge
eee2a96029 Modify logging statements 2023-10-14 20:10:46 +02:00
D. Berge
6f5e5a4d20 Fix bug for shortcut when there is only one candidate project 2023-10-14 20:09:07 +02:00
D. Berge
9e73cb7e00 Clean up on SIGINT, SIGHUP signals 2023-10-14 20:07:19 +02:00
D. Berge
d7ab4eec7c Run some tasks periodically from the main process.
This reduces reliance on crontab jobs.
2023-10-14 20:06:38 +02:00
D. Berge
cdd96a4bc7 Don't bother trying to kill the child process on exit.
The exit signal handler does not allow asynchronous tasks and,
besides, killing the parent should kill its children too.
2023-10-14 20:02:54 +02:00
D. Berge
39a21766b6 Exit on start up errors 2023-10-14 20:02:04 +02:00
D. Berge
0e33c18b5c Replace console.log() with debug library calls 2023-10-14 19:57:57 +02:00
D. Berge
7f411ac7dd Add queue libraries.
A basic queue implementation and one that consumes its items
automatically until empty.
2023-10-14 19:56:56 +02:00
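A minimal sketch of the two flavours described (implementation details assumed):

// Assumed sketch: a basic queue, and one that consumes its items
// automatically until empty.
class Queue {
  constructor () { this.items = []; }
  push (item) { this.items.push(item); }
  shift () { return this.items.shift(); }
  get length () { return this.items.length; }
}

// Each pushed item is an async task, executed strictly one at a
// time until the queue has drained.
class SelfConsumingQueue extends Queue {
  push (task) {
    super.push(task);
    if (!this.running) this.run();
  }
  async run () {
    this.running = true;
    while (this.length) {
      try {
        await this.shift()();
      } catch (err) {
        // log and continue; one failed task must not stall the queue
        console.error(err);
      }
    }
    this.running = false;
  }
}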
D. Berge
ed1da11c9d Add helper function to purge notifications 2023-10-14 19:54:34 +02:00
D. Berge
66ec28dd83 Refactor DB notifications listener to support large payloads.
The listener will automatically retrieve the full payload
before passing it on to event handlers.
2023-10-14 18:33:41 +02:00
D. Berge
b928d96774 Add database upgrade file 30. 2023-10-14 18:29:28 +02:00
D. Berge
73335f9c1e Merge branch '136-add-line-change-time-log-pseudoevent' into 'devel'
Resolve "Add line change time log pseudoevent"

Closes #136

See merge request wgp/dougal/software!45
2023-10-04 12:50:49 +00:00
D. Berge
7b6b81dbc5 Add more debugging statements 2023-10-04 14:50:12 +02:00
D. Berge
2e11c574c2 Throw rather than return.
Otherwise the finally {} block won't run.
2023-10-04 14:49:35 +02:00
D. Berge
d07565807c Do not retry immediately 2023-10-04 14:49:09 +02:00
D. Berge
6eccbf215a There should be no need to await.
That is because the queue handler will, in theory, only ever
process one event at a time.
2023-09-30 21:29:15 +02:00
D. Berge
8abc05f04e Remove dead code 2023-09-30 21:29:15 +02:00
D. Berge
8f587467f9 Add comment 2023-09-30 21:29:15 +02:00
D. Berge
3d7a91c7ff Rewrite ReportLineChangeTime 2023-09-30 21:29:15 +02:00
D. Berge
3fd408074c Support passing array in opts.sequences to event.list() 2023-09-30 21:29:15 +02:00
D. Berge
f71cbd8f51 Add unique utility function 2023-09-30 21:29:15 +02:00
D. Berge
915df8ac16 Add handler for creation of line change time events 2023-09-30 21:29:15 +02:00
D. Berge
d5ecb08a2d Allow switching to event entry by time.
A ‘Timed’ button is shown when a new (not edited) event is in
the event entry dialogue and the event has sequence and/or
point values. Pressing the button deletes the sequence/point
information and sets the date and time fields to current time.

Fixes #277.
2023-09-30 21:26:32 +02:00
D. Berge
9388cd4861 Make daily_tasks work with new project configuration 2023-09-30 20:36:46 +02:00
D. Berge
180590b411 Mark events as being automatically generated 2023-09-30 01:42:27 +02:00
D. Berge
4ec37539bf Add utils to work with Postgres ranges 2023-09-30 01:41:45 +02:00
D. Berge
8755fe01b6 Refactor events.list.
The SQL has been simplified and the following changes made:

- The `sequence` argument now can only take one individual
  sequence, not a list of sequences.
- A new `sequences` argument is recognised. It takes a list
  of sequences (as a string).
- A new `label` argument is recognised. It takes a label
  name and returns events containing that label.
- A new `jpq` argument is recognised. It takes a JSONPath
  string which is applied to `meta` with jsonb_path_exists(),
  returning any events for which the JSON path expression
  matches.
2023-09-30 01:37:22 +02:00
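Illustrative calls exercising the new arguments (all values made up; the call shape follows the commit messages above, and meta.readonly appears elsewhere in this changeset):

// `sequence` now takes a single sequence only:
const bySequence = await events.list({ sequence: 1042 });

// `sequences` takes a list of sequences as a string:
const bySequences = await events.list({ sequences: "1042,1043,1044" });

// `label` returns events carrying that label:
const qcEvents = await events.list({ label: "QC" });

// `jpq` is a JSONPath expression applied to `meta` via jsonb_path_exists():
const readonlyEvents = await events.list({ jpq: '$.readonly ? (@ == true)' });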
D. Berge
0bfe54e0c2 Include the meta attribute when posting events 2023-09-30 01:36:18 +02:00
D. Berge
29bc689b84 Merge branch '276-add-soft-start-event-detection' into 'devel'
Resolve "Add soft start event detection"

Closes #276

See merge request wgp/dougal/software!44
2023-09-29 15:02:57 +00:00
D. Berge
d408665d62 Write meta info to automatic events 2023-09-29 16:49:27 +02:00
71 changed files with 2876 additions and 788 deletions

View File

@@ -11,11 +11,9 @@ from datastore import Datastore
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
print("Connecting to database")
db = Datastore()
surveys = db.surveys()
print("Reading surveys")
for survey in surveys:

View File

@@ -0,0 +1,164 @@
-- Support notification payloads larger than Postgres' NOTIFY limit.
--
-- New schema version: 0.4.3
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects the public schema only.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This creates a new table where large notification payloads are stored
-- temporarily and from which they might be recalled by the notification
-- listeners. It also creates a purge_notifications() procedure used to
-- clean up old notifications from the notifications log and finally,
-- modifies notify() to support these changes. When a large payload is
-- encountered, the payload is stored in the notify_payloads table and
-- a trimmed down version containing a notification_id is sent to listeners
-- instead. Listeners can then query notify_payloads to retrieve the full
-- payloads. It is the application layer's responsibility to delete old
-- notifications.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_schema () AS $outer$
BEGIN
RAISE NOTICE 'Updating public schema';
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO public');
CREATE TABLE IF NOT EXISTS public.notify_payloads (
id SERIAL,
tstamp timestamptz NOT NULL DEFAULT CURRENT_TIMESTAMP,
payload text NOT NULL DEFAULT '',
PRIMARY KEY (id)
);
CREATE INDEX IF NOT EXISTS notify_payload_tstamp ON notify_payloads (tstamp);
CREATE OR REPLACE FUNCTION public.notify() RETURNS trigger
LANGUAGE plpgsql
AS $$
DECLARE
channel text := TG_ARGV[0];
pid text;
payload text;
notification text;
payload_id integer;
BEGIN
SELECT projects.pid INTO pid FROM projects WHERE schema = TG_TABLE_SCHEMA;
payload := json_build_object(
'tstamp', CURRENT_TIMESTAMP,
'operation', TG_OP,
'schema', TG_TABLE_SCHEMA,
'table', TG_TABLE_NAME,
'old', row_to_json(OLD),
'new', row_to_json(NEW),
'pid', pid
)::text;
IF octet_length(payload) < 1000 THEN
PERFORM pg_notify(channel, payload);
ELSE
-- We need to find another solution
-- FIXME Consider storing the payload in a temporary memory table,
-- referenced by some form of autogenerated ID. Then send the ID
-- as the payload and then it's up to the user to fetch the original
-- payload if interested. This needs a mechanism to expire older payloads
-- in the interest of conserving memory.
INSERT INTO notify_payloads (payload) VALUES (payload) RETURNING id INTO payload_id;
notification := json_build_object(
'tstamp', CURRENT_TIMESTAMP,
'operation', TG_OP,
'schema', TG_TABLE_SCHEMA,
'table', TG_TABLE_NAME,
'pid', pid,
'payload_id', payload_id
)::text;
PERFORM pg_notify(channel, notification);
RAISE INFO 'Payload over limit';
END IF;
RETURN NULL;
END;
$$;
CREATE PROCEDURE public.purge_notifications (age_seconds numeric DEFAULT 120) AS $$
DELETE FROM notify_payloads WHERE EXTRACT(epoch FROM CURRENT_TIMESTAMP - tstamp) > age_seconds;
$$ LANGUAGE sql;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.3' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.4.2' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
-- This upgrade modified the `public` schema only, not individual
-- project schemas.
CALL pg_temp.upgrade_schema();
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_schema ();
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.3"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.3"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--
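The listener side of this contract might look roughly like the following node-postgres sketch (an assumption, not part of this changeset): small payloads arrive inline, while large ones arrive as a stub whose payload_id is used to fetch the stored payload.

// Assumed listener-side sketch matching the notify() contract above.
const { Client } = require('pg');

async function listen (channel, onEvent) {
  const client = new Client();        // connection settings from env vars
  await client.connect();
  // channel is assumed to be a trusted identifier (LISTEN cannot be
  // parameterised)
  await client.query(`LISTEN ${channel}`);
  client.on('notification', async (msg) => {
    let payload = JSON.parse(msg.payload);
    if (payload.payload_id) {
      // Trimmed-down notification: recall the full payload
      const res = await client.query(
        'SELECT payload FROM notify_payloads WHERE id = $1',
        [payload.payload_id]
      );
      if (res.rows.length) payload = JSON.parse(res.rows[0].payload);
    }
    onEvent(payload);
  });
}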

View File

@@ -0,0 +1,104 @@
-- Add event_log_changes function
--
-- New schema version: 0.4.4
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This adds a function event_log_changes which returns the subset of
-- events from event_log_full which have been modified on or after a
-- given timestamp.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE pg_temp.show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $outer$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION event_log_changes(ts0 timestamptz)
RETURNS SETOF event_log_full
LANGUAGE sql
AS $$
SELECT *
FROM event_log_full
WHERE lower(validity) > ts0 OR (upper(validity) IS NOT NULL AND upper(validity) > ts0)
ORDER BY lower(validity);
$$;
END;
$outer$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade () AS $outer$
DECLARE
row RECORD;
current_db_version TEXT;
BEGIN
SELECT value->>'db_schema' INTO current_db_version FROM public.info WHERE key = 'version';
IF current_db_version >= '0.4.4' THEN
RAISE EXCEPTION
USING MESSAGE='Patch already applied';
END IF;
IF current_db_version != '0.4.3' THEN
RAISE EXCEPTION
USING MESSAGE='Invalid database version: ' || current_db_version,
HINT='Ensure all previous patches have been applied.';
END IF;
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$outer$ LANGUAGE plpgsql;
CALL pg_temp.upgrade();
CALL pg_temp.show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade ();
CALL pg_temp.show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.4.4"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.4.4"}' WHERE public.info.key = 'version';
CALL pg_temp.show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE pg_temp.show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--
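The refreshEvents Vuex action further down consumes this function through the event changes endpoint; a standalone call might look like this (the project id and timestamp are placeholders):

// Illustrative request against the event changes endpoint (path as
// documented in the Vuex event module below; "p123" is a placeholder).
const since = new Date('2023-10-17T00:00:00Z').toISOString();
const res = await fetch(`/api/project/p123/event/changes/${since}?unique=t`);
const modified = await res.json();
// Entries carry an is_deleted flag; cf. the setModifiedEvents mutation below.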

View File

@@ -44,7 +44,7 @@
<template v-slot:activator="{ on, attrs }">
<v-text-field
v-model="tsDate"
:disabled="!!(sequence || point || entrySequence || entryPoint)"
:disabled="!!(entrySequence || entryPoint)"
label="Date"
suffix="UTC"
prepend-icon="mdi-calendar"
@@ -64,7 +64,7 @@
<v-col>
<v-text-field
v-model="tsTime"
:disabled="!!(sequence || point || entrySequence || entryPoint)"
:disabled="!!(entrySequence || entryPoint)"
label="Time"
suffix="UTC"
prepend-icon="mdi-clock-outline"
@@ -256,6 +256,15 @@
>
Cancel
</v-btn>
<v-btn v-if="!id && (entrySequence || entryPoint)"
color="info"
text
title="Enter an event by time"
@click="timed"
>
<v-icon left small>mdi-clock-outline</v-icon>
Timed
</v-btn>
<v-spacer></v-spacer>
<v-btn
:disabled="!canSave"
@@ -632,6 +641,14 @@ export default {
}
},
timed () {
const tstamp = (new Date()).toISOString();
this.entrySequence = null;
this.entryPoint = null;
this.tsDate = tstamp.substr(0, 10);
this.tsTime = tstamp.substr(11, 8);
},
close () {
this.entryLabels = this.selectedLabels.map(this.labelToItem)
this.$emit("input", false);

View File

@@ -2,8 +2,8 @@
<div class="line-status" v-if="sequences.length == 0">
<slot name="empty"></slot>
</div>
<div class="line-status" v-else-if="sequenceHref">
<router-link v-for="sequence in sequences" :key="sequence.sequence"
<div class="line-status" v-else-if="sequenceHref || plannedSequenceHref || pendingReshootHref">
<router-link v-for="sequence in sequences" :key="sequence.sequence" v-if="sequenceHref"
class="sequence"
:class="sequence.status"
:style="style(sequence)"
@@ -11,15 +11,41 @@
:to="sequenceHref(sequence)"
>
</router-link>
<router-link v-for="sequence in plannedSequences" :key="sequence.sequence" v-if="plannedSequenceHref"
class="sequence planned"
:style="style(sequence)"
:title="title(sequence, 'planned')"
:to="plannedSequenceHref(sequence)"
>
</router-link>
<router-link v-for="(line, key) in pendingReshoots" :key="key" v-if="pendingReshootHref"
class="sequence reshoot"
:style="style(line)"
:title="title(line, 'reshoot')"
:to="pendingReshootHref(line)"
>
</router-link>
</div>
<div class="line-status" v-else>
<div v-for="sequence in sequences"
<div v-for="sequence in sequences" :key="sequence.sequence"
class="sequence"
:class="sequence.status"
:style="style(sequence)"
:title="title(sequence)"
>
</div>
<div v-for="sequence in plannedSequences" :key="sequence.sequence"
class="sequence planned"
:style="style(sequence)"
:title="title(sequence, 'planned')"
>
</div>
<div v-for="(line, key) in pendingReshoots" :key="key"
class="sequence reshoot"
:style="style(line)"
:title="title(line, 'reshoot')"
>
</div>
</div>
</template>
@@ -48,6 +74,8 @@
background-color blue
&.planned
background-color magenta
&.reshoot
background repeating-linear-gradient(-45deg, rgba(255,0,255,0.302), brown 5px, rgba(247, 247, 247, 0.1) 5px, rgba(242, 241, 241, 0.08) 10px), repeating-linear-gradient(45deg, rgba(255,0,255,0.302), brown 5px, rgba(247, 247, 247, 0.1) 5px, rgba(242, 241, 241, 0.08) 10px)
</style>
<script>
@@ -58,7 +86,11 @@ export default {
props: {
preplot: Object,
sequences: Array,
"sequence-href": Function
"sequence-href": Function,
"planned-sequences": Array,
"planned-sequence-href": Function,
"pending-reshoots": Array,
"pending-reshoot-href": Function
},
methods: {
@@ -68,13 +100,13 @@ export default {
? s.fsp_final
: s.status == "ntbp"
? (s.fsp_final || s.fsp)
: s.fsp; /* status == "raw" */
: s.fsp; /* status == "raw" or planned sequence or pending reshoot */
const lsp = s.status == "final"
? s.lsp_final
: s.status == "ntbp"
? (s.lsp_final || s.lsp)
: s.lsp; /* status == "raw" */
: s.lsp; /* status == "raw" or planned sequence or pending reshoot */
const pp0 = Math.min(this.preplot.fsp, this.preplot.lsp);
const pp1 = Math.max(this.preplot.fsp, this.preplot.lsp);
@@ -91,20 +123,24 @@ export default {
return values;
},
title (s) {
const status = s.status == "final"
? "Final"
: s.status == "raw"
? "Acquired"
: s.status == "ntbp"
? "NTBP"
: s.status == "planned"
? "Planned"
: s.status;
title (s, type) {
if (s.status || type == "planned") {
const status = s.status == "final"
? "Final"
: s.status == "raw"
? "Acquired"
: s.status == "ntbp"
? "NTBP"
: type == "planned"
? "Planned"
: s.status;
const remarks = "\n"+[s.remarks, s.remarks_final].join("\n").trim()
const remarks = "\n"+[s.remarks, s.remarks_final].join("\n").trim()
return `Sequence ${s.sequence} ${status} (${s.fsp_final || s.fsp}–${s.lsp_final || s.lsp})${remarks}`;
return `Sequence ${s.sequence} ${status} (${s.fsp_final || s.fsp}–${s.lsp_final || s.lsp})${remarks}`;
} else if (type == "reshoot") {
return `Pending reshoot (${s.fsp}–${s.lsp})${s.remarks? "\n"+s.remarks : ""}`;
}
}
}

View File

@@ -5,6 +5,11 @@ import api from './modules/api'
import user from './modules/user'
import snack from './modules/snack'
import project from './modules/project'
import event from './modules/event'
import label from './modules/label'
import sequence from './modules/sequence'
import plan from './modules/plan'
import line from './modules/line'
import notify from './modules/notify'
Vue.use(Vuex)
@@ -15,6 +20,11 @@ export default new Vuex.Store({
user,
snack,
project,
event,
label,
sequence,
plan,
line,
notify
}
})

View File

@@ -0,0 +1,129 @@
/** Fetch events from server
*/
async function refreshEvents ({commit, dispatch, state, rootState}, [modifiedAfter] = []) {
if (!modifiedAfter) {
modifiedAfter = state.timestamp;
}
if (state.loading) {
commit('abortEventsLoading');
}
commit('setEventsLoading');
const pid = rootState.project.projectId;
const url = modifiedAfter
? `/project/${pid}/event/changes/${(new Date(modifiedAfter)).toISOString()}?unique=t`
: `/project/${pid}/event`;
const init = {
signal: state.loading.signal
};
const res = await dispatch('api', [url, init]);
if (res) {
if (modifiedAfter) {
commit('setModifiedEvents', res);
} else {
commit('setEvents', res);
}
commit('setEventsTimestamp');
}
commit('clearEventsLoading');
}
/** Return a subset of events from state.events
*/
async function getEvents ({commit, dispatch, state}, [projectId, {sequence, date0, date1, sortBy, sortDesc, itemsPerPage, page, text, label}]) {
let filteredEvents = [...state.events];
if (sortBy) {
sortBy.forEach( (key, idx) => {
filteredEvents.sort( (el0, el1) => {
const a = el0?.[key];
const b = el1?.[key];
if (a < b) {
return -1;
} else if (a > b) {
return 1;
} else if (a == b) {
return 0;
} else if (a && !b) {
return 1;
} else if (!a && b) {
return -1;
} else {
return 0;
}
});
if (sortDesc && sortDesc[idx] === true) {
filteredEvents.reverse();
}
});
}
if (sequence) {
filteredEvents = filteredEvents.filter( event => event.sequence == sequence );
}
if (date0 && date1) {
filteredEvents = filteredEvents.filter( event =>
event.tstamp.substr(0, 10) >= date0 && event.tstamp.substr(0, 10) <= date1
);
} else if (date0) {
filteredEvents = filteredEvents.filter( event => event.tstamp.substr(0, 10) == date0 );
}
if (text) {
const tstampFilter = (value, search, item) => {
return textFilter(value, search, item);
};
const numberFilter = (value, search, item) => {
return value == search;
};
const textFilter = (value, search, item) => {
return String(value).toLowerCase().includes(search.toLowerCase());
};
const searchFunctions = {
tstamp: tstampFilter,
sequence: numberFilter,
point: numberFilter,
remarks: textFilter,
labels: (value, search, item) => value.some(label => textFilter(label, search, item))
};
filteredEvents = filteredEvents.filter ( event => {
for (let key in searchFunctions) {
const fn = searchFunctions[key];
if (fn(event[key], text, event)) {
return true;
}
}
return false;
});
}
if (label) {
filteredEvents = filteredEvents.filter( event => event.labels?.includes(label) );
}
const count = filteredEvents.length;
if (itemsPerPage && itemsPerPage > 0) {
const offset = (page > 0)
? (page-1) * itemsPerPage
: 0;
filteredEvents = filteredEvents.slice(offset, offset+itemsPerPage);
}
return {events: filteredEvents, count};
}
export default { refreshEvents, getEvents };

View File

@@ -0,0 +1,14 @@
function events (state) {
return state.events;
}
function eventCount (state) {
return state.events?.length ?? 0;
}
function eventsLoading (state) {
return !!state.loading;
}
export default { events, eventCount, eventsLoading };

View File

@@ -0,0 +1,6 @@
import state from './state'
import getters from './getters'
import actions from './actions'
import mutations from './mutations'
export default { state, getters, actions, mutations };

View File

@@ -0,0 +1,73 @@
function setEvents (state, events) {
// We don't need or want the events array to be reactive, since
// it can be tens of thousands of items long.
state.events = Object.freeze(events);
}
/** Selectively replace / insert / delete events
* from state.events.
*
* modifiedEvents is the result of
* /api/project/:project/event/changes?unique=t
*/
function setModifiedEvents (state, modifiedEvents) {
const events = [...state.events];
for (let evt of modifiedEvents) {
const idx = events.findIndex(i => i.id == evt.id);
if (idx != -1) {
if (evt.is_deleted) {
events.splice(idx, 1);
} else {
delete evt.is_deleted;
events.splice(idx, 1, evt);
}
} else {
if (!evt.is_deleted) {
delete evt.is_deleted;
events.unshift(evt);
}
}
}
setEvents(state, events);
}
function setEventsLoading (state, abortController = new AbortController()) {
state.loading = abortController;
}
function clearEventsLoading (state) {
state.loading = null;
}
function setEventsTimestamp (state, timestamp = new Date()) {
if (timestamp === true) {
const tstamp = state.events
.map( event => event.modified_on )
.reduce( (acc, cur) => acc > cur ? acc : cur );
state.timestamp = tstamp ? new Date(tstamp) : new Date();
} else {
state.timestamp = timestamp;
}
}
function setEventsETag (state, etag) {
state.etag = etag;
}
function abortEventsLoading (state) {
if (state.loading) {
state.loading.abort();
}
state.loading = null;
}
export default {
setEvents,
setModifiedEvents,
setEventsLoading,
clearEventsLoading,
abortEventsLoading,
setEventsTimestamp,
setEventsETag
};

View File

@@ -0,0 +1,8 @@
const state = () => ({
events: Object.freeze([]),
loading: null,
timestamp: null,
etag: null,
});
export default state;

View File

@@ -0,0 +1,106 @@
/** Fetch labels from server
*/
async function refreshLabels ({commit, dispatch, state, rootState}) {
if (state.loading) {
commit('abortLabelsLoading');
}
commit('setLabelsLoading');
const pid = rootState.project.projectId;
const url = `/project/${pid}/label`;
const init = {
signal: state.loading.signal
};
const res = await dispatch('api', [url, init]);
if (res) {
commit('setLabels', res);
commit('setLabelsTimestamp');
}
commit('clearLabelsLoading');
}
/** Return a subset of labels from state.labels.
*
* Note that, unlike other actions in the get* family,
* the return value is not isomorphic to the state.
*
* While state.labels is an object, getLabels() returns
* an array with each item having the shape:
*
* { label: "labelName", view: {…}, model: {…} }
*
* This is intended to be useful, for instance, for a table
* of labels.
*/
async function getLabels ({commit, dispatch, state}, [projectId, {sortBy, sortDesc, itemsPerPage, page, text, label}]) {
let filteredLabels = Object.entries(state.labels).map(i => {
return {
label: i[0],
...i[1]
}
});
if (sortBy) {
sortBy.forEach( (key, idx) => {
filteredLabels.sort( (el0, el1) => {
const a = key == "label" ? el0[0] : el0[1].view[key];
const b = key == "label" ? el1[0] : el1[1].view[key];
if (a < b) {
return -1;
} else if (a > b) {
return 1;
} else if (a == b) {
return 0;
} else if (a && !b) {
return 1;
} else if (!a && b) {
return -1;
} else {
return 0;
}
});
if (sortDesc && sortDesc[idx] === true) {
filteredLabels.reverse();
}
});
}
if (label) {
filteredLabels = filteredLabels.filter( item => item.label == label );
}
if (text) {
const textFilter = (value, search, item) => {
return String(value).toLowerCase().includes(search.toLowerCase());
};
filteredLabels = filteredLabels.filter ( item => {
return textFilter(item.label, text, item) || textFilter(item.view.description, text, item);
});
}
const count = filteredLabels.length;
if (itemsPerPage && itemsPerPage > 0) {
const offset = (page > 0)
? (page-1) * itemsPerPage
: 0;
filteredLabels = filteredLabels.slice(offset, offset+itemsPerPage);
}
return {labels: filteredLabels, count};
}
export default { refreshLabels, getLabels };

View File

@@ -0,0 +1,22 @@
function labels (state) {
return state.labels;
}
/** Return labels that can be added by users.
*
* As opposed to system labels.
*/
function userLabels (state) {
return Object.fromEntries(Object.entries(state.labels).filter(i => i[1].model.user));
}
function labelCount (state) {
return state.labels?.length ?? 0;
}
function labelsLoading (state) {
return !!state.loading;
}
export default { labels, userLabels, labelCount, labelsLoading };

View File

@@ -0,0 +1,6 @@
import state from './state'
import getters from './getters'
import actions from './actions'
import mutations from './mutations'
export default { state, getters, actions, mutations };

View File

@@ -0,0 +1,49 @@
function setLabels (state, labels) {
// We don't need or want the labels object to be reactive, since
// it can contain a large number of entries.
state.labels = Object.freeze(labels);
}
function setLabelsLoading (state, abortController = new AbortController()) {
state.loading = abortController;
}
// This assumes that we know any transactions have finished or we
// don't care about aborting.
function clearLabelsLoading (state) {
state.loading = null;
}
function setLabelsTimestamp (state, timestamp = new Date()) {
// NOTE: There is no `modified_on` property in the labels
// result or in the database schema, but we could add
// one.
if (timestamp === true) {
const tstamp = Object.values(state.labels)
.map( i => i.modified_on )
.reduce( (acc, cur) => acc > cur ? acc : cur );
state.timestamp = tstamp ? new Date(tstamp) : new Date();
} else {
state.timestamp = timestamp;
}
}
function setLabelsETag (state, etag) {
state.etag = etag;
}
function abortLabelsLoading (state) {
if (state.loading) {
state.loading.abort();
}
state.loading = null;
}
export default {
setLabels,
setLabelsLoading,
clearLabelsLoading,
setLabelsTimestamp,
setLabelsETag
};

View File

@@ -0,0 +1,8 @@
const state = () => ({
labels: Object.freeze([]),
loading: null,
timestamp: null,
etag: null,
});
export default state;

View File

@@ -0,0 +1,117 @@
/** Fetch lines from server
*/
async function refreshLines ({commit, dispatch, state, rootState}) {
if (state.loading) {
commit('abortLinesLoading');
}
commit('setLinesLoading');
const pid = rootState.project.projectId;
const url = `/project/${pid}/line`;
const init = {
signal: state.loading.signal
};
const res = await dispatch('api', [url, init]);
if (res) {
commit('setLines', res);
commit('setLinesTimestamp');
}
commit('clearLinesLoading');
}
/** Return a subset of lines from state.lines
*/
async function getLines ({commit, dispatch, state}, [projectId, {line, fsp, lsp, incr, sortBy, sortDesc, itemsPerPage, page, text}]) {
let filteredLines = [...state.lines];
if (sortBy) {
sortBy.forEach( (key, idx) => {
filteredLines.sort( (el0, el1) => {
const a = el0?.[key];
const b = el1?.[key];
if (a < b) {
return -1;
} else if (a > b) {
return 1;
} else if (a == b) {
return 0;
} else if (a && !b) {
return 1;
} else if (!a && b) {
return -1;
} else {
return 0;
}
});
if (sortDesc && sortDesc[idx] === true) {
filteredLines.reverse();
}
});
}
if (line) {
filteredLines = filteredLines.filter( item => item.line == line );
}
if (fsp) {
filteredLines = filteredLines.filter( line => line.fsp == fsp );
}
if (lsp) {
filteredLines = filteredLines.filter( line => line.lsp == lsp );
}
if (text) {
const numberFilter = (value, search, item) => {
return value == search;
};
const textFilter = (value, search, item) => {
return String(value).toLowerCase().includes(search.toLowerCase());
};
const incrFilter = (value, search, item) => {
const inc = /^(incr(ement)?|↑|\+)/i;
const dec = /^(decr(ement)?|↓|-)/i;
return (inc.test(search) && value) || (dec.test(search) && !value)
}
const searchFunctions = {
line: numberFilter,
fsp: numberFilter,
lsp: numberFilter,
remarks: textFilter,
incr: incrFilter,
ntba: (value, search, item) => text.toLowerCase() == "ntba" && value
};
filteredLines = filteredLines.filter ( line => {
for (let key in searchFunctions) {
const fn = searchFunctions[key];
if (fn(line[key], text, line)) {
return true;
}
}
return false;
});
}
const count = filteredLines.length;
if (itemsPerPage && itemsPerPage > 0) {
const offset = (page > 0)
? (page-1) * itemsPerPage
: 0;
filteredLines = filteredLines.slice(offset, offset+itemsPerPage);
}
return {lines: filteredLines, count};
}
export default { refreshLines, getLines };

View File

@@ -0,0 +1,14 @@
function lines (state) {
return state.lines;
}
function lineCount (state) {
return state.lines?.length ?? 0;
}
function linesLoading (state) {
return !!state.loading;
}
export default { lines, lineCount, linesLoading };

View File

@@ -0,0 +1,6 @@
import state from './state'
import getters from './getters'
import actions from './actions'
import mutations from './mutations'
export default { state, getters, actions, mutations };

View File

@@ -0,0 +1,49 @@
function setLines (state, lines) {
// We don't need or want the lines array to be reactive, since
// it can be tens of thousands of items long.
state.lines = Object.freeze(lines);
}
function setLinesLoading (state, abortController = new AbortController()) {
state.loading = abortController;
}
// This assumes that we know any transactions have finished or we
// don't care about aborting.
function clearLinesLoading (state) {
state.loading = null;
}
function setLinesTimestamp (state, timestamp = new Date()) {
// NOTE: There is no `modified_on` property in the lines
// result or in the database schema, but we could perhaps add
// one.
if (timestamp === true) {
const tstamp = state.lines
.map( event => event.modified_on )
.reduce( (acc, cur) => acc > cur ? acc : cur );
state.timestamp = tstamp ? new Date(tstamp) : new Date();
} else {
state.timestamp = timestamp;
}
}
function setLinesETag (state, etag) {
state.etag = etag;
}
function abortLinesLoading (state) {
if (state.loading) {
state.loading.abort();
}
state.loading = null;
}
export default {
setLines,
setLinesLoading,
clearLinesLoading,
setLinesTimestamp,
setLinesETag
};

View File

@@ -0,0 +1,8 @@
const state = () => ({
lines: Object.freeze([]),
loading: null,
timestamp: null,
etag: null,
});
export default state;

View File

@@ -0,0 +1,114 @@
/** Fetch sequences from server
*/
async function refreshPlan ({commit, dispatch, state, rootState}) {
if (state.loading) {
commit('abortPlanLoading');
}
commit('setPlanLoading');
const pid = rootState.project.projectId;
const url = `/project/${pid}/plan`;
const init = {
signal: state.loading.signal
};
const res = await dispatch('api', [url, init]);
if (res) {
commit('setPlan', res);
commit('setPlanTimestamp');
}
commit('clearPlanLoading');
}
/** Return a subset of sequences from state.sequences
*/
async function getPlannedSequences ({commit, dispatch, state}, [projectId, {sequence, date0, date1, sortBy, sortDesc, itemsPerPage, page, text}]) {
let filteredPlannedSequences = [...state.sequences];
if (sortBy) {
sortBy.forEach( (key, idx) => {
filteredPlannedSequences.sort( (el0, el1) => {
const a = el0?.[key];
const b = el1?.[key];
if (a < b) {
return -1;
} else if (a > b) {
return 1;
} else if (a == b) {
return 0;
} else if (a && !b) {
return 1;
} else if (!a && b) {
return -1;
} else {
return 0;
}
});
if (sortDesc && sortDesc[idx] === true) {
filteredPlannedSequences.reverse();
}
});
}
if (sequence) {
filteredPlannedSequences = filteredPlannedSequences.filter( item => item.sequence == sequence );
}
if (date0 && date1) {
filteredPlannedSequences = filteredPlannedSequences.filter( sequence =>
sequence.ts0.substr(0, 10) >= date0 && sequence.ts1.substr(0, 10) <= date1
);
} else if (date0) {
filteredPlannedSequences = filteredPlannedSequences.filter( sequence => sequence.ts0.substr(0, 10) == date0 || sequence.ts1.substr(0, 10) == date0 );
}
if (text) {
const tstampFilter = (value, search, item) => {
return textFilter(value.toISOString(), search, item);
};
const numberFilter = (value, search, item) => {
return value == search;
};
const textFilter = (value, search, item) => {
return String(value).toLowerCase().includes(search.toLowerCase());
};
const searchFunctions = {
sequence: numberFilter,
line: numberFilter,
remarks: textFilter,
ts0: tstampFilter,
ts1: tstampFilter
};
filteredPlannedSequences = filteredPlannedSequences.filter ( sequence => {
for (let key in searchFunctions) {
const fn = searchFunctions[key];
if (fn(sequence[key], text, sequence)) {
return true;
}
}
return false;
});
}
const count = filteredPlannedSequences.length;
if (itemsPerPage && itemsPerPage > 0) {
const offset = (page > 0)
? (page-1) * itemsPerPage
: 0;
filteredPlannedSequences = filteredPlannedSequences.slice(offset, offset+itemsPerPage);
}
return {sequences: filteredPlannedSequences, count};
}
export default { refreshPlan, getPlannedSequences };

View File

@@ -0,0 +1,18 @@
function planRemarks (state) {
return state.remarks;
}
function plannedSequences (state) {
return state.sequences;
}
function plannedSequenceCount (state) {
return state.sequences?.length ?? 0;
}
function plannedSequencesLoading (state) {
return !!state.loading;
}
export default { planRemarks, plannedSequences, plannedSequenceCount, plannedSequencesLoading };

View File

@@ -0,0 +1,6 @@
import state from './state'
import getters from './getters'
import actions from './actions'
import mutations from './mutations'
export default { state, getters, actions, mutations };

View File

@@ -0,0 +1,59 @@
function transform (item) {
item.ts0 = new Date(item.ts0);
item.ts1 = new Date(item.ts1);
return item;
}
// ATTENTION: This relies on the new planner endpoint
// as per issue #281.
function setPlan (state, plan) {
// We don't need or want the planned sequences array to be reactive
state.sequences = Object.freeze(plan.sequences.map(transform));
state.remarks = plan.remarks;
}
function setPlanLoading (state, abortController = new AbortController()) {
state.loading = abortController;
}
// This assumes that we know any transactions have finished or we
// don't care about aborting.
function clearPlanLoading (state) {
state.loading = null;
}
function setPlanTimestamp (state, timestamp = new Date()) {
// NOTE: There is no `modified_on` property in the plan
// result or in the database schema, but we should probably add
// one.
if (timestamp === true) {
const tstamp = state.sequences
.map( item => item.modified_on )
.reduce( (acc, cur) => acc > cur ? acc : cur );
state.timestamp = tstamp ? new Date(tstamp) : new Date();
} else {
state.timestamp = timestamp;
}
}
function setPlanETag (state, etag) {
state.etag = etag;
}
function abortPlanLoading (state) {
if (state.loading) {
state.loading.abort();
}
state.loading = null;
}
export default {
setPlan,
setPlanLoading,
clearPlanLoading,
setPlanTimestamp,
setPlanETag
};

View File

@@ -0,0 +1,9 @@
const state = () => ({
sequences: Object.freeze([]),
remarks: null,
loading: null,
timestamp: null,
etag: null,
});
export default state;

View File

@@ -1,10 +1,11 @@
async function getProject ({commit, dispatch}, projectId) {
const res = await dispatch('api', [`/project/${String(projectId).toLowerCase()}`]);
const res = await dispatch('api', [`/project/${String(projectId).toLowerCase()}/configuration`]);
if (res) {
commit('setProjectName', res.name);
commit('setProjectId', res.pid);
commit('setProjectId', res.id?.toLowerCase());
commit('setProjectSchema', res.schema);
commit('setProjectConfiguration', res);
const recentProjects = JSON.parse(localStorage.getItem("recentProjects") || "[]")
recentProjects.unshift(res);
localStorage.setItem("recentProjects", JSON.stringify(recentProjects.slice(0, 3)));
@@ -12,6 +13,7 @@ async function getProject ({commit, dispatch}, projectId) {
commit('setProjectName', null);
commit('setProjectId', null);
commit('setProjectSchema', null);
commit('setProjectConfiguration', {});
}
}

View File

@@ -11,4 +11,8 @@ function projectSchema (state) {
return state.projectSchema;
}
export default { projectId, projectName, projectSchema };
function projectConfiguration (state) {
return state.projectConfiguration;
}
export default { projectId, projectName, projectSchema, projectConfiguration };

View File

@@ -11,4 +11,8 @@ function setProjectSchema (state, schema) {
state.projectSchema = schema;
}
export default { setProjectId, setProjectName, setProjectSchema };
function setProjectConfiguration (state, configuration) {
state.projectConfiguration = Object.freeze(configuration);
}
export default { setProjectId, setProjectName, setProjectSchema, setProjectConfiguration };

View File

@@ -1,7 +1,8 @@
const state = () => ({
projectId: null,
projectName: null,
projectSchema: null
projectSchema: null,
projectConfiguration: {}
});
export default state;

View File

@@ -0,0 +1,122 @@
/** Fetch sequences from server
*/
async function refreshSequences ({commit, dispatch, state, rootState}) {
if (state.loading) {
commit('abortSequencesLoading');
}
commit('setSequencesLoading');
const pid = rootState.project.projectId;
const url = `/project/${pid}/sequence?files=true`;
const init = {
signal: state.loading.signal
};
const res = await dispatch('api', [url, init]);
if (res) {
commit('setSequences', res);
commit('setSequencesTimestamp');
}
commit('clearSequencesLoading');
}
/** Return a subset of sequences from state.sequences
*/
async function getSequences ({commit, dispatch, state}, [projectId, {sequence, date0, date1, sortBy, sortDesc, itemsPerPage, page, text}]) {
let filteredSequences = [...state.sequences];
if (sortBy) {
sortBy.forEach( (key, idx) => {
filteredSequences.sort( (el0, el1) => {
const a = el0?.[key];
const b = el1?.[key];
if (a < b) {
return -1;
} else if (a > b) {
return 1;
} else if (a == b) {
return 0;
} else if (a && !b) {
return 1;
} else if (!a && b) {
return -1;
} else {
return 0;
}
});
if (sortDesc && sortDesc[idx] === true) {
filteredSequences.reverse();
}
});
}
if (sequence) {
filteredSequences = filteredSequences.filter( item => item.sequence == sequence );
}
if (date0 && date1) {
filteredSequences = filteredSequences.filter( sequence =>
(sequence.ts0_final ?? sequence.ts0)?.substr(0, 10) >= date0 &&
(sequence.ts1_final ?? sequence.ts1)?.substr(0, 10) <= date1
);
} else if (date0) {
filteredSequences = filteredSequences.filter( sequence => (sequence.ts0_final ?? sequence.ts0)?.substr(0, 10) == date0 || (sequence.ts1_final ?? sequence.ts1)?.substr(0, 10) == date0 );
}
if (text) {
const tstampFilter = (value, search, item) => {
return search?.length >= 5 && textFilter(value, search, item);
};
const numberFilter = (value, search, item) => {
return value == search;
};
const textFilter = (value, search, item) => {
return String(value).toLowerCase().includes(search.toLowerCase());
};
const searchFunctions = {
ts0: tstampFilter,
ts1: tstampFilter,
ts0_final: tstampFilter,
ts1_final: tstampFilter,
sequence: numberFilter,
line: numberFilter,
fsp: numberFilter,
lsp: numberFilter,
fsp_final: numberFilter,
lsp_final: numberFilter,
remarks: textFilter,
remarks_final: textFilter
};
filteredSequences = filteredSequences.filter ( sequence => {
for (let key in searchFunctions) {
const fn = searchFunctions[key];
if (fn(sequence[key], text, sequence)) {
return true;
}
}
return false;
});
}
const count = filteredSequences.length;
if (itemsPerPage && itemsPerPage > 0) {
const offset = (page > 0)
? (page-1) * itemsPerPage
: 0;
filteredSequences = filteredSequences.slice(offset, offset+itemsPerPage);
}
return {sequences: filteredSequences, count};
}
export default { refreshSequences, getSequences };

View File

@@ -0,0 +1,14 @@
function sequences (state) {
return state.sequences;
}
function sequenceCount (state) {
return state.sequences?.length ?? 0;
}
function sequencesLoading (state) {
return !!state.loading;
}
export default { sequences, sequenceCount, sequencesLoading };

View File

@@ -0,0 +1,6 @@
import state from './state'
import getters from './getters'
import actions from './actions'
import mutations from './mutations'
export default { state, getters, actions, mutations };

View File

@@ -0,0 +1,49 @@
function setSequences (state, sequences) {
// We don't need or want the sequences array to be reactive, since
// it can be tens of thousands of items long.
state.sequences = Object.freeze(sequences);
}
function setSequencesLoading (state, abortController = new AbortController()) {
state.loading = abortController;
}
// This assumes that we know any transactions have finished or we
// don't care about aborting.
function clearSequencesLoading (state) {
state.loading = null;
}
function setSequencesTimestamp (state, timestamp = new Date()) {
// NOTE: There is no `modified_on` property in the sequences
// result or in the database schema, but we should probably add
// one.
if (timestamp === true) {
const tstamp = state.sequences
.map( event => event.modified_on )
.reduce( (acc, cur) => acc > cur ? acc : cur );
state.timestamp = tstamp ? new Date(tstamp) : new Date();
} else {
state.timestamp = timestamp;
}
}
function setSequencesETag (state, etag) {
state.etag = etag;
}
function abortSequencesLoading (state) {
if (state.loading) {
state.loading.abort();
}
state.loading = null;
}
export default {
setSequences,
setSequencesLoading,
clearSequencesLoading,
setSequencesTimestamp,
setSequencesETag
};

View File

@@ -0,0 +1,8 @@
const state = () => ({
sequences: Object.freeze([]),
loading: null,
timestamp: null,
etag: null,
});
export default state;

View File

@@ -39,6 +39,12 @@
{{ $refs.calendar.title }}
</v-toolbar-title>
<v-spacer></v-spacer>
<v-btn v-if="categoriesAvailable"
small
class="mx-4"
v-model="useCategories"
@click="useCategories = !useCategories"
>Labels {{useCategories ? "On" : "Off"}}</v-btn>
<v-menu bottom right>
<template v-slot:activator="{ on, attrs }">
<v-btn
@@ -72,16 +78,23 @@
<v-calendar
ref="calendar"
v-model="focus"
:events="events"
:events="items"
:event-color="getEventColour"
color="primary"
:type="type"
:type="view"
:locale-first-day-of-year="4"
:weekdays="weekdays"
:show-week="true"
:category-days="categoryDays"
:categories="categories"
@click:date="showLogForDate"
@click:event="showLogForEvent"
></v-calendar>
@change="setSpan"
>
<template v-slot:event="{ event }">
<div style="height:100%;overflow:scroll;" v-html="event.name"></div>
</template>
</v-calendar>
</v-sheet>
</div>
</template>
@@ -97,8 +110,9 @@ export default {
weekdays: [1, 2, 3, 4, 5, 6, 0],
type: "week",
focus: "",
events: [
],
items: [],
useCategories: false,
span: {},
options: {
sortBy: "sequence"
}
@@ -117,28 +131,126 @@ export default {
return labels[this.type];
},
...mapGetters(['loading'])
view () {
return this.useCategories ? "category" : this.type;
},
categoriesAvailable () {
return this.type == "day" || this.type == "4day";
},
categoryDays () {
if (this.useCategories) {
const days = {
month: 30,
week: 7,
"4day": 4,
day: 1
};
return days[this.type];
}
},
visibleItems () {
return this.items.filter(i => {
const end = i.end ?? i.start;
if (i.start > this.span.end) {
return false;
}
if (end < this.span.start) {
return false;
}
return true;
});
},
categories () {
return [...new Set(this.visibleItems.map(i => i.category ?? "General"))];
},
...mapGetters(['sequencesLoading', 'sequences', 'events'])
},
watch: {
sequences () {
const isFirstLoad = !this.items.length;
this.getEvents();
if (isFirstLoad) {
this.setLast();
}
},
events () {
const isFirstLoad = !this.items.length;
this.getEvents();
if (isFirstLoad) {
this.setLast();
}
},
type () {
this.getEvents();
},
categoriesAvailable (value) {
if (!value) {
this.useCategories = false;
}
}
},
methods: {
async getEvents () {
const query = new URLSearchParams(this.options);
const url = `/project/${this.$route.params.project}/sequence?${query.toString()}`;
const finalSequences = await this.api([url]) || [];
this.events = finalSequences.map(s => {
const sequences = this.sequences.map(s => {
const e = {};
//e.start = s.ts0.substring(0,10)+" "+s.ts0.substring(11,19)
//e.end = s.ts1.substring(0,10)+" "+s.ts1.substring(11,19)
e.routerLink = { name: "logBySequence", params: { sequence: s.sequence } };
e.start = new Date(s.ts0);
e.end = new Date(s.ts1);
e.timed = true;
e.colour = "orange";
e.name = `Sequence ${s.sequence}`;
e.name = `<b>Sequence ${s.sequence}</b><br/>Line ${s.line}<br/><abbr title="Shotpoints">SP</abbr> ${s.fgsp ?? s.fsp}–${s.lgsp ?? s.lsp}`;
e.category = "Sequence"
return e;
});
const lineChanges = this.events.filter(i => i.meta?.["*ReportLineChangeTime*"]?.value && i.meta?.["*ReportLineChangeTime*"]?.type != "excess").map(i => {
const e = {};
const duration = i.meta?.["*ReportLineChangeTime*"]?.value;
e.end = new Date(i.tstamp);
e.start = new Date(e.end - duration);
e.timed = true;
e.colour = "pink";
e.name = "Line change";
e.category = "Production"
return e;
});
const excludedLabels = [ "FSP", "FGSP", "LSP", "LGSP", "QC" ];
const otherEvents = this.events.filter(i => !excludedLabels.some(l => i.labels.includes(l))).map(i => {
const e = {};
e.start = new Date(i.tstamp);
e.colour = "brown";
e.timed = true;
e.name = this.$options.filters.markdownInline(i.remarks);
e.category = i.labels[0];
return e;
});
this.items = [...sequences];
if (this.type == "day" || this.type == "4day") {
this.items.push(...lineChanges, ...otherEvents);
}
},
getEventColour (event) {
@@ -150,11 +262,15 @@ export default {
},
setFirst () {
this.focus = this.events[this.events.length-1].start;
if (this.items.length) {
this.focus = this.items[this.items.length-1].start;
}
},
setLast () {
this.focus = this.events[0].start;
if (this.items.length) {
this.focus = this.items[0].start;
}
},
prev () {
@@ -175,6 +291,13 @@ export default {
}
},
setSpan (span) {
this.span = {
start: new Date(span.start.date),
end: new Date((new Date(span.end.date)).valueOf() + 86400000)
};
},
...mapActions(["api"])
@@ -182,9 +305,7 @@ export default {
async mounted () {
await this.getEvents();
if (this.events.length) {
this.setLast();
}
this.setLast();
}
}

View File

@@ -11,6 +11,7 @@
label="Filter"
single-line
clearable
hint="Filter by line number, first or last shotpoint or remarks. Use incr or + / decr or - to show only incrementing / decrementing lines"
></v-text-field>
</v-toolbar>
</v-card-title>
@@ -106,12 +107,14 @@
<v-data-table
:headers="headers"
:items="items"
item-key="line"
:items-per-page.sync="itemsPerPage"
:server-items-length="lineCount"
item-key="line"
:search="filter"
:loading="loading"
:fixed-header="true"
:footer-props='{itemsPerPageOptions: [ 10, 25, 50, 100, 500, -1 ]}'
:loading="linesLoading"
:options.sync="options"
fixed-header
:footer-props='{itemsPerPageOptions: [ 10, 25, 50, 100, 500, -1 ], showFirstLastPage: true}'
:item-class="itemClass"
:show-select="selectOn"
v-model="selectedRows"
@@ -124,6 +127,10 @@
:preplot="item"
:sequences="sequences.filter(s => s.line == item.line)"
:sequence-href="(s) => `/projects/${$route.params.project}/log/sequence/${s.sequence}`"
:planned-sequences="plannedSequences.filter(s => s.line == item.line)"
:planned-sequence-href="() => `/projects/${$route.params.project}/plan`"
:pending-reshoots="null"
:pending-reshoot-href="null"
>
<template v-slot:empty>
<div v-if="!item.ntba" class="sequence" title="Virgin"></div>
@@ -161,7 +168,7 @@
icon
small
title="Edit"
:disabled="loading"
:disabled="linesLoading"
@click="editItem(item, 'remarks')"
>
<v-icon small>mdi-square-edit-outline</v-icon>
@@ -251,9 +258,10 @@ export default {
items: [],
selectOn: false,
selectedRows: [],
filter: null,
num_lines: null,
sequences: [],
filter: "",
options: {},
lineCount: null,
//sequences: [],
activeItem: null,
edit: null, // {line, key, value}
queuedReload: false,
@@ -273,11 +281,22 @@ export default {
},
computed: {
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
...mapGetters(['user', 'writeaccess', 'linesLoading', 'lines', 'sequences', 'plannedSequences'])
},
watch: {
options: {
handler () {
this.fetchLines();
},
deep: true
},
async lines () {
await this.fetchLines();
},
async edit (newVal, oldVal) {
if (newVal === null && oldVal !== null) {
const item = this.items.find(i => i.line == oldVal.line);
@@ -296,39 +315,9 @@ export default {
}
},
async serverEvent (event) {
if (event.payload.pid == this.$route.params.project) {
if (event.channel == "preplot_lines" || event.channel == "preplot_points") {
if (!this.loading && !this.queuedReload) {
// Do not force a non-cached response if refreshing as a result
// of an event notification. We will assume that the server has
// already had time to update the cache by the time our request
// gets back to it.
this.getLines();
} else {
this.queuedReload = true;
}
} else if ([ "planned_lines", "raw_lines", "final_lines" ].includes(event.channel)) {
if (!this.loading && !this.queuedReload) {
this.getSequences();
} else {
this.queuedReload = true;
}
}
}
},
queuedReload (newVal, oldVal) {
if (newVal && !oldVal && !this.loading) {
this.getLines();
this.getSequences();
}
},
loading (newVal, oldVal) {
if (!newVal && oldVal && this.queuedReload) {
this.getLines();
this.getSequences();
filter (newVal, oldVal) {
if (newVal?.toLowerCase() != oldVal?.toLowerCase()) {
this.fetchLines();
}
},
@@ -468,43 +457,28 @@ export default {
}
},
async getNumLines () {
const projectInfo = await this.api([`/project/${this.$route.params.project}`]);
this.num_lines = projectInfo.lines;
},
async getLines () {
const url = `/project/${this.$route.params.project}/line`;
this.queuedReload = false;
this.items = await this.api([url]) || [];
},
async getSequences () {
const urlS = `/project/${this.$route.params.project}/sequence`;
this.sequences = await this.api([urlS]) || [];
const urlP = `/project/${this.$route.params.project}/plan`;
const planned = await this.api([urlP]) || [];
planned.forEach(i => i.status = "planned");
this.sequences.push(...planned);
},
setActiveItem (item) {
this.activeItem = this.activeItem == item
? null
: item;
},
...mapActions(["api"])
async fetchLines (opts = {}) {
const options = {
text: this.filter,
...this.options
};
const res = await this.getLines([this.$route.params.project, options]);
this.items = res.lines;
this.lineCount = res.count;
},
...mapActions(["api", "getLines"])
},
mounted () {
this.getLines();
this.getNumLines();
this.getSequences();
this.fetchLines();
// Initialise stylesheet
const el = document.createElement("style");

View File

@@ -93,7 +93,21 @@
append-icon="mdi-magnify"
label="Filter"
single-line
hide-details></v-text-field>
clearable
hide-details>
<template v-slot:prepend-inner>
<v-chip v-if="labelSearch"
class="mr-1"
small
close
@click:close="labelSearch=null"
:color="labels[labelSearch] && labels[labelSearch].view.colour"
:title="labels[labelSearch] && labels[labelSearch].view.description"
:dark="labels[labelSearch] && labels[labelSearch].view.dark"
:light="labels[labelSearch] && labels[labelSearch].view.light"
>{{labelSearch}}</v-chip>
</template>
</v-text-field>
</v-toolbar>
</v-card-title>
<v-card-text>
@@ -215,13 +229,14 @@
:headers="headers"
:items="rows"
:items-per-page.sync="itemsPerPage"
:server-items-length="eventCount"
item-key="key"
:item-class="itemClass"
sort-by="tstamp"
:sort-desc="true"
:search="filter"
:custom-filter="searchTable"
:loading="loading"
:loading="eventsLoading"
:options.sync="options"
fixed-header
:footer-props='{itemsPerPageOptions: [ 10, 25, 50, 100, 500, -1 ], showFirstLastPage: true}'
@click:row="setActiveItem"
@@ -249,12 +264,12 @@
:dark="labels[label] && labels[label].view.dark"
:light="labels[label] && labels[label].view.light"
:key="label"
:href="$route.path+'?label='+encodeURIComponent(label)"
@click="labelSearch=label"
>{{label}}</v-chip>
</span>
<dougal-event-edit-history v-if="entry.has_edits && writeaccess"
:id="entry.id"
:disabled="loading"
:disabled="eventsLoading"
:labels="labels"
></dougal-event-edit-history>
<span v-if="entry.meta.readonly"
@@ -385,7 +400,6 @@ export default {
}
],
items: [],
labels: {},
options: {},
filter: "",
filterableLabels: [ "QC", "QCAccepted" ],
@@ -394,7 +408,6 @@ export default {
eventDialog: false,
eventLabelsDialog: false,
defaultEventTimestamp: null,
presetRemarks: null,
remarksMenu: null,
remarksMenuItem: null,
editedEvent: {},
@@ -444,17 +457,6 @@ export default {
return Object.values(rows);
},
userLabels () {
const filtered = {};
for (const key in this.labels) {
if (this.labels[key].model.user) {
filtered[key] = this.labels[key];
}
}
return filtered;
},
popularLabels () {
const tuples = this.items.flatMap( i => i.labels )
.filter( l => (this.labels[l]??{})?.model?.user )
@@ -466,6 +468,10 @@ export default {
.sort( (a, b) => b[1]-a[1] );
},
presetRemarks () {
return this.projectConfiguration?.events?.presetRemarks ?? [];
},
defaultSequence () {
if (this.$route.params.sequence) {
return Number(this.$route.params.sequence.split(";").pop());
@@ -474,19 +480,24 @@ export default {
}
},
...mapGetters(['user', 'writeaccess', 'loading', 'online', 'sequence', 'line', 'point', 'position', 'timestamp', 'lineName', 'serverEvent']),
...mapGetters(['user', 'writeaccess', 'eventsLoading', 'online', 'sequence', 'line', 'point', 'position', 'timestamp', 'lineName', 'serverEvent', 'events', 'labels', 'userLabels']),
...mapState({projectSchema: state => state.project.projectSchema})
},
watch: {
options: {
handler () {
//this.getEvents();
async handler () {
await this.fetchEvents();
},
deep: true
},
async events () {
console.log("Events changed");
await this.fetchEvents();
},
eventDialog (val) {
if (val) {
// If not online
@@ -494,30 +505,14 @@ export default {
}
},
async serverEvent (event) {
if (event.channel == "event" && event.payload.schema == this.projectSchema) {
if (!this.loading && !this.queuedReload) {
// Do not force a non-cached response if refreshing as a result
// of an event notification. We will assume that the server has
// already had time to update the cache by the time our request
// gets back to it.
this.getEvents();
} else {
this.queuedReload = true;
}
filter (newVal, oldVal) {
if (newVal?.toLowerCase() != oldVal?.toLowerCase()) {
this.fetchEvents();
}
},
queuedReload (newVal, oldVal) {
if (newVal && !oldVal && !this.loading) {
this.getEvents();
}
},
loading (newVal, oldVal) {
if (!newVal && oldVal && this.queuedReload) {
this.getEvents();
}
labelSearch () {
this.fetchEvents();
},
itemsPerPage (newVal, oldVal) {
@@ -574,50 +569,15 @@ export default {
}
},
async getEventCount () {
//this.eventCount = await this.api([`/project/${this.$route.params.project}/event/?count`]);
this.eventCount = null;
},
async getEvents (opts = {}) {
const query = new URLSearchParams(this.options);
if (this.options.itemsPerPage < 0) {
query.delete("itemsPerPage");
}
if (this.$route.params.sequence) {
query.set("sequence", this.$route.params.sequence);
}
if (this.$route.params.date0) {
query.set("date0", this.$route.params.date0);
}
if (this.$route.params.date1) {
query.set("date1", this.$route.params.date1);
}
const url = `/project/${this.$route.params.project}/event?${query.toString()}`;
this.queuedReload = false;
this.items = await this.api([url, opts]) || [];
},
async getLabelDefinitions () {
const url = `/project/${this.$route.params.project}/label`;
//const labelSet = {};
this.labels = await this.api([url]) ?? {};
//labels.forEach( l => labelSet[l.name] = l.data );
//this.labels = labelSet;
},
async getPresetRemarks () {
const url = `/project/${this.$route.params.project}/configuration`;
this.presetRemarks = (await this.api([url]))?.events?.presetRemarks ?? {};
async fetchEvents (opts = {}) {
const options = {
text: this.filter,
label: this.labelSearch,
...this.options
};
const res = await this.getEvents([this.$route.params.project, options]);
this.items = res.events;
this.eventCount = res.count;
},
newItem (from = {}) {
@@ -691,7 +651,7 @@ export default {
if (!err && res.ok) {
this.showSnack(["Event saved", "success"]);
this.queuedReload = true;
this.getEvents({cache: "reload"});
this.fetchEvents({cache: "reload"});
}
}
@@ -709,7 +669,7 @@ export default {
if (!err && res.ok) {
this.showSnack(["Event saved", "success"]);
this.queuedReload = true;
this.getEvents({cache: "reload"});
this.fetchEvents({cache: "reload"});
}
}
@@ -756,7 +716,7 @@ export default {
if (!err && res.ok) {
this.showSnack([`${ids.length} events deleted`, "red"]);
this.queuedReload = true;
this.getEvents({cache: "reload"});
this.fetchEvents({cache: "reload"});
}
}
@@ -772,7 +732,7 @@ export default {
if (!err && res.ok) {
this.showSnack(["Event deleted", "red"]);
this.queuedReload = true;
this.getEvents({cache: "reload"});
this.fetchEvents({cache: "reload"});
}
}
@@ -806,19 +766,6 @@ export default {
},
searchTable (value, search, item) {
if (!value && !search) return true;
const s = search.toLowerCase();
if (typeof value === 'string') {
return value.toLowerCase().includes(s);
} else if (typeof value === 'number') {
return value == search;
} else {
return item.items.some( i => i.remarks.toLowerCase().includes(s) ) ||
item.items.some( i => i.labels.some( l => l.toLowerCase().includes(s) ));
}
},
viewOnMap(item) {
if (item?.meta && item.meta?.geometry?.type == "Point") {
const [ lon, lat ] = item.meta.geometry.coordinates;
@@ -857,14 +804,11 @@ export default {
*/
},
...mapActions(["api", "showSnack"])
...mapActions(["api", "showSnack", "refreshEvents", "getEvents"])
},
async mounted () {
await this.getLabelDefinitions();
this.getEventCount();
this.getEvents();
this.getPresetRemarks();
this.fetchEvents();
window.addEventListener('keyup', this.handleKeyboardEvent);
},
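
To make the new pattern explicit: the component above now keeps only presentation state (filter text, pagination options) and delegates all data access to the store. The sketch below shows a Vuex event module consistent with that usage; the action names (getEvents, refreshEvents), the getters (events, eventsLoading) and the { events, count } return shape come from the component code, while the state layout, the api dispatch and the endpoint details are assumptions for illustration.

// Sketch only: a Vuex event module matching the component usage above.
export default {
  state: () => ({
    eventsEpoch: 0,       // bumped by refreshEvents; exposed as the `events` getter
    eventsLoading: false
  }),
  getters: {
    events: s => s.eventsEpoch,
    eventsLoading: s => s.eventsLoading
  },
  mutations: {
    bump (s) { s.eventsEpoch++; },
    loading (s, v) { s.eventsLoading = v; }
  },
  actions: {
    // Signal watching components (via the `events` getter) to re-fetch.
    refreshEvents ({ commit }) { commit("bump"); },
    // Fetch one page; resolves to { events, count } as consumed by fetchEvents().
    async getEvents ({ commit, dispatch }, [ pid, options ]) {
      commit("loading", true);
      try {
        const query = new URLSearchParams(options);
        const events = await dispatch("api", [`/project/${pid}/event?${query}`]) || [];
        // A real module would take the total count from the server response.
        return { events, count: events.length };
      } finally {
        commit("loading", false);
      }
    }
  }
};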

View File

@@ -44,6 +44,7 @@
label="Filter"
single-line
clearable
hint="Filter by sequence, line, first or last shotpoints, remarks or start/end time"
></v-text-field>
</v-toolbar>
</v-card-title>
@@ -109,11 +110,14 @@
:headers="headers"
:items="items"
:items-per-page.sync="itemsPerPage"
:server-items-length="sequenceCount"
item-key="sequence"
:search="filter"
:loading="loading"
:fixed-header="true"
:loading="plannedSequencesLoading"
fixed-header
no-data-text="No planned lines. Add lines via the context menu from either the Lines or Sequences view."
:item-class="(item) => (activeItem == item && !edit) ? 'blue accent-1 elevation-3' : ''"
:footer-props="{showFirstLastPage: true}"
@click:row="setActiveItem"
@contextmenu:row="contextMenu"
>
@@ -275,7 +279,7 @@
icon
small
title="Edit"
:disabled="loading"
:disabled="plannedSequencesLoading"
@click="editItem(item, 'remarks')"
>
<v-icon small>mdi-square-edit-outline</v-icon>
@@ -417,7 +421,8 @@ export default {
remarks: null,
editRemarks: false,
filter: null,
num_lines: null,
options: {},
sequenceCount: null,
activeItem: null,
edit: null, // {sequence, key, value}
queuedReload: false,
@@ -552,11 +557,22 @@ export default {
},
computed: {
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
...mapGetters(['user', 'writeaccess', 'plannedSequencesLoading', 'plannedSequences', 'planRemarks'])
},
watch: {
options: {
handler () {
this.fetchPlannedSequences();
},
deep: true
},
async plannedSequences () {
await this.fetchPlannedSequences();
},
async edit (newVal, oldVal) {
if (newVal === null && oldVal !== null) {
const item = this.items.find(i => i.sequence == oldVal.sequence);
@@ -587,41 +603,9 @@ export default {
}
},
async serverEvent (event) {
if (event.channel == "planned_lines" && event.payload.pid == this.$route.params.project) {
// Ignore non-ops
/*
if (event.payload.old === null && event.payload.new === null) {
return;
}
*/
if (!this.loading && !this.queuedReload) {
// Do not force a non-cached response if refreshing as a result
// of an event notification. We will assume that the server has
// already had time to update the cache by the time our request
// gets back to it.
this.getPlannedLines();
} else {
this.queuedReload = true;
}
} else if (event.channel == "info" && event.payload.pid == this.$route.params.project) {
if (event.payload?.new?.key == "plan" && ("remarks" in (event.payload?.new?.value || {}))) {
this.remarks = event.payload?.new.value.remarks;
}
}
},
queuedReload (newVal, oldVal) {
if (newVal && !oldVal && !this.loading) {
this.getPlannedLines();
}
},
loading (newVal, oldVal) {
if (!newVal && oldVal && this.queuedReload) {
this.getPlannedLines();
filter (newVal, oldVal) {
if (newVal?.toLowerCase() != oldVal?.toLowerCase()) {
this.fetchPlannedSequences();
}
},
@@ -890,7 +874,6 @@ export default {
const url = `/project/${this.$route.params.project}/plan/${this.contextMenuItem.sequence}`;
const init = {method: "DELETE"};
await this.api([url, init]);
await this.getPlannedLines();
},
editItem (item, key, value) {
@@ -942,21 +925,6 @@ export default {
}
},
async getPlannedLines () {
const url = `/project/${this.$route.params.project}/plan`;
this.queuedReload = false;
this.items = await this.api([url]) || [];
for (const item of this.items) {
item.ts0 = new Date(item.ts0);
item.ts1 = new Date(item.ts1);
this.wxQuery(item).then( (wx) => {
item.meta = {...item.meta, wx};
});
}
},
async getPlannerConfig () {
const url = `/project/${this.$route.params.project}/configuration/planner`;
this.plannerConfig = await this.api([url]) || {
@@ -967,14 +935,15 @@ export default {
}
},
async getPlannerRemarks () {
const url = `/project/${this.$route.params.project}/info/plan/remarks`;
this.remarks = await this.api([url]) || "";
},
async getSequences () {
const url = `/project/${this.$route.params.project}/sequence`;
this.sequences = await this.api([url]) || [];
async fetchPlannedSequences (opts = {}) {
const options = {
text: this.filter,
...this.options
};
const res = await this.getPlannedSequences([this.$route.params.project, options]);
this.items = res.sequences;
this.sequenceCount = res.count;
this.remarks = this.planRemarks;
},
setActiveItem (item) {
@@ -983,13 +952,12 @@ export default {
: item;
},
...mapActions(["api", "showSnack"])
...mapActions(["api", "showSnack", "getPlannedSequences"])
},
async mounted () {
await this.getPlannerConfig();
this.getPlannedLines();
this.getPlannerRemarks();
await this.fetchPlannedSequences();
}
}

View File

@@ -37,7 +37,7 @@ export default {
return this.loading || this.projectId;
},
...mapGetters(["loading", "projectId", "serverEvent"])
...mapGetters(["loading", "projectId", "projectSchema", "serverEvent"])
},
watch: {
@@ -45,16 +45,39 @@ export default {
if (event.channel == "project" && event.payload?.operation == "DELETE" && event.payload?.schema == "public") {
// Project potentially deleted
await this.getProject(this.$route.params.project);
} else if (event.payload?.schema == this.projectSchema) {
if (event.channel == "event") {
this.refreshEvents();
} else if (event.channel == "planned_lines") {
this.refreshPlan();
} else if (["raw_lines", "final_lines", "final_shots"].includes(event.channel)) {
this.refreshSequences();
} else if (["preplot_lines", "preplot_points"].includes(event.channel)) {
this.refreshLines();
} else if (event.channel == "info") {
if ((event.payload?.new ?? event.payload?.old)?.key == "plan") {
this.refreshPlan();
}
} else if (event.channel == "project") {
this.getProject(this.$route.params.project);
}
}
}
},
methods: {
...mapActions(["getProject"])
...mapActions(["getProject", "refreshLines", "refreshSequences", "refreshEvents", "refreshLabels", "refreshPlan"])
},
async mounted () {
await this.getProject(this.$route.params.project);
if (this.projectFound) {
this.refreshLines();
this.refreshSequences();
this.refreshEvents();
this.refreshLabels();
this.refreshPlan();
}
}
}

View File

@@ -148,15 +148,16 @@
:headers="headers"
:items="items"
:items-per-page.sync="itemsPerPage"
:server-items-length="sequenceCount"
item-key="sequence"
:server-items-length="num_rows"
:search="filter"
:custom-filter="customFilter"
:loading="loading"
:fixed-header="true"
:footer-props='{itemsPerPageOptions: [ 10, 25, 50, 100, 500, -1 ]}'
show-expand
:item-class="(item) => activeItem == item ? 'blue accent-1 elevation-3' : ''"
:search="filter"
x-custom-filter="customFilter"
:loading="sequencesLoading"
:options.sync="options"
fixed-header
:footer-props='{itemsPerPageOptions: [ 10, 25, 50, 100, 500, -1 ], showFirstLastPage: true}'
show-expand
@click:row="setActiveItem"
@contextmenu:row="contextMenu"
>
@@ -176,7 +177,7 @@
icon
small
title="Cancel edit"
:disabled="loading"
:disabled="sequencesLoading"
@click="edit.value = item.remarks; edit = null"
>
<v-icon small>mdi-close</v-icon>
@@ -185,7 +186,7 @@
icon
small
title="Save edits"
:disabled="loading"
:disabled="sequencesLoading"
@click="edit = null"
>
<v-icon small>mdi-content-save-edit-outline</v-icon>
@@ -196,7 +197,7 @@
icon
small
title="Edit"
:disabled="loading"
:disabled="sequencesLoading"
@click="editItem(item, 'remarks')"
>
<v-icon small>mdi-square-edit-outline</v-icon>
@@ -210,7 +211,7 @@
class="markdown"
autofocus
placeholder="Enter your text here"
:disabled="loading"
:disabled="sequencesLoading"
v-model="edit.value"
>
</v-textarea>
@@ -228,7 +229,7 @@
icon
small
title="Cancel edit"
:disabled="loading"
:disabled="sequencesLoading"
@click="edit.value = item.remarks_final; edit = null"
>
<v-icon small>mdi-close</v-icon>
@@ -237,7 +238,7 @@
icon
small
title="Save edits"
:disabled="loading"
:disabled="sequencesLoading"
@click="edit = null"
>
<v-icon small>mdi-content-save-edit-outline</v-icon>
@@ -248,7 +249,7 @@
icon
small
title="Edit"
:disabled="loading"
:disabled="sequencesLoading"
@click="editItem(item, 'remarks_final')"
>
<v-icon small>mdi-square-edit-outline</v-icon>
@@ -262,7 +263,7 @@
class="markdown"
autofocus
placeholder="Enter your text here"
:disabled="loading"
:disabled="sequencesLoading"
v-model="edit.value"
>
</v-textarea>
@@ -566,7 +567,7 @@ export default {
items: [],
filter: "",
options: {},
num_rows: null,
sequenceCount: null,
activeItem: null,
edit: null, // {sequence, key, value}
queuedReload: false,
@@ -593,17 +594,22 @@ export default {
return this.queuedItems.find(i => i.payload.sequence == this.contextMenuItem.sequence);
},
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
...mapGetters(['user', 'writeaccess', 'sequencesLoading', 'sequences'])
},
watch: {
options: {
handler () {
this.getSequences();
this.fetchSequences();
},
deep: true
},
async sequences () {
await this.fetchSequences();
},
async edit (newVal, oldVal) {
if (newVal === null && oldVal !== null) {
const item = this.items.find(i => i.sequence == oldVal.sequence);
@@ -617,39 +623,9 @@ export default {
}
},
async serverEvent (event) {
const subscriptions = ["raw_lines", "final_lines", "final_shots"];
if (subscriptions.includes(event.channel) && event.payload.pid == this.$route.params.project) {
if (!this.loading && !this.queuedReload) {
// Do not force a non-cached response if refreshing as a result
// of an event notification. We will assume that the server has
// already had time to update the cache by the time our request
// gets back to it.
this.getSequences();
} else {
this.queuedReload = true;
}
} else if (event.channel == "queue_items") {
const project =
event.payload?.project ??
event.payload?.new?.payload?.project ??
event.payload?.old?.payload?.project;
if (project == this.$route.params.project) {
this.getQueuedItems();
}
}
},
queuedReload (newVal, oldVal) {
if (newVal && !oldVal && !this.loading) {
this.getSequences();
}
},
loading (newVal, oldVal) {
if (!newVal && oldVal && this.queuedReload) {
this.getSequences();
filter (newVal, oldVal) {
if (newVal?.toLowerCase() != oldVal?.toLowerCase()) {
this.fetchSequences();
}
},
@@ -818,19 +794,14 @@ export default {
this.num_rows = projectInfo.sequences;
},
async getSequences () {
const query = new URLSearchParams(this.options);
query.set("filter", this.filter);
query.set("files", true);
if (this.options.itemsPerPage < 0) {
query.delete("itemsPerPage");
}
const url = `/project/${this.$route.params.project}/sequence?${query.toString()}`;
this.queuedReload = false;
this.items = await this.api([url]) || [];
async fetchSequences (opts = {}) {
const options = {
text: this.filter,
...this.options
};
const res = await this.getSequences([this.$route.params.project, options]);
this.items = res.sequences;
this.sequenceCount = res.count;
},
async getQueuedItems () {
@@ -878,11 +849,11 @@ export default {
return false;
},
...mapActions(["api", "showSnack"])
...mapActions(["api", "showSnack", "getSequences"])
},
mounted () {
this.getSequences();
this.fetchSequences();
this.getNumLines();
this.getQueuedItems();
}

View File

@@ -181,6 +181,9 @@ app.map({
post: [ mw.auth.access.write, mw.event.post ],
put: [ mw.auth.access.write, mw.event.put ],
delete: [ mw.auth.access.write, mw.event.delete ],
'changes/:since': {
get: [ mw.event.changes ]
},
// TODO Rename -/:sequence → sequence/:sequence
'-/:sequence/': { // NOTE: We need to avoid conflict with the next endpoint ☹
get: [ mw.event.sequence.get ],

View File

@@ -43,15 +43,26 @@ const rels = [
matches: [ ],
callback (url, data) {
if (data.payload?.table == "info") {
const pid = data.payload?.pid;
const key = (data.payload?.new ?? data.payload?.old)?.key;
const rx = /^\/project\/([^\/]+)\/info\/([^\/?]+)[\/?]?/;
const match = url.match(rx);
if (match) {
if (match[1] == data.payload.pid) {
if (match[1] == pid) {
if (match[2] == data.payload?.old?.key || match[2] == data.payload?.new?.key) {
return true;
}
}
}
if (key == "plan") {
const rx = /^\/project\/([^\/]+)\/plan[\/?]?/;
const match = url.match(rx);
if (match) {
return match[1] == pid;
}
}
}
return false;
}
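
To make the new invalidation rule concrete, this is how the regular expression classifies a few candidate cache keys for a notification with payload { table: "info", pid: "p1", new: { key: "plan" } } (project ids illustrative):

// Illustration of the "plan" cache-invalidation rule added above:
const rx = /^\/project\/([^\/]+)\/plan[\/?]?/;
"/project/p1/plan".match(rx)[1] == "p1";          // true  → cached planner response invalidated
"/project/p1/plan?limit=5".match(rx)[1] == "p1";  // true  → invalidated
"/project/p2/plan".match(rx)[1] == "p1";          // false → another project's cache is kept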

View File

@@ -0,0 +1,14 @@
const { event } = require('../../../lib/db');
const json = async function (req, res, next) {
try {
const response = await event.changes(req.params.project, req.params.since, req.query);
res.status(200).send(response);
next();
} catch (err) {
next(err);
}
};
module.exports = json;
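
Assuming the server's default HTTP_PATH of /api (see the entry point further down), the new endpoint can be exercised as below; the since path parameter and the unique query flag are documented in the API specification at the end of this changeset. The project id is illustrative.

// Hypothetical client call for the changes endpoint mapped above:
const since = encodeURIComponent("1970-01-01T00:00:00Z");
const res = await fetch(`/api/project/myproject/event/changes/${since}?unique=t`);
const changes = await res.json(); // modified and deleted event records since `since`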

View File

@@ -6,5 +6,6 @@ module.exports = {
post: require('./post'),
put: require('./put'),
patch: require('./patch'),
delete: require('./delete')
delete: require('./delete'),
changes: require('./changes')
}

View File

@@ -1,9 +1,14 @@
const { plan } = require('../../../../lib/db');
const { plan, info } = require('../../../../lib/db');
const json = async function (req, res, next) {
try {
const response = await plan.list(req.params.project, req.query);
const sequences = await plan.list(req.params.project, req.query) ?? [];
const remarks = await info.get(req.params.project, "plan/remarks", req.query, req.user.role) ?? null;
const response = {
remarks,
sequences
};
res.status(200).send(response);
next();
} catch (err) {

View File

@@ -9,105 +9,16 @@ const { ALERT, ERROR, WARNING, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
* the last shot and first shot of the previous and current dates, respectively.
*/
class DetectFDSP {
/* Data may come much faster than we can process it, so we put it
* in a queue and process it at our own pace.
*
* The run() method fills the queue with the necessary data and then
* calls processQueue().
*
* The processQueue() method looks at the first two elements in
* the queue and processes them if they are not already being taken
* care of by a previous processQueue() call; this will happen when
* data is coming in faster than it can be processed.
*
* If the processQueue() call is the first to see the two
* bottommost elements, it will process them and, when finished, it will set
* the `isPending` flag of the bottommost element to `false`, thus
* letting the next call know that it has work to do.
*
* If the queue was empty, run() will set the `isPending` flag of its
* first element to a falsy value, thus bootstrapping the process.
*/
static MAX_QUEUE_SIZE = 125000;
queue = [];
author = `*${this.constructor.name}*`;
prev = null;
async processQueue () {
DEBUG("Queue length", this.queue.length)
while (this.queue.length > 1) {
if (this.queue[0].isPending) {
setImmediate(() => this.processQueue());
return;
}
const prev = this.queue.shift();
const cur = this.queue[0];
const sequence = Number(cur._sequence);
try {
if (prev.lineName == cur.lineName && prev._sequence == cur._sequence &&
prev.lineStatus == "online" && cur.lineStatus == "online" && sequence) {
// DEBUG("Previous", prev);
// DEBUG("Current", cur);
if (prev.time.substr(0, 10) != cur.time.substr(0, 10)) {
// Possibly a date change, but could also be a missing timestamp
// or something else.
const ts0 = new Date(prev.time)
const ts1 = new Date(cur.time);
if (!isNaN(ts0) && !isNaN(ts1) && ts0.getUTCDay() != ts1.getUTCDay()) {
INFO("Sequence shot across midnight UTC detected", cur._sequence, cur.lineName);
const ldsp = {
sequence: prev._sequence,
point: prev._point,
remarks: "Last shotpoint of the day",
labels: ["LDSP", "Prod"],
meta: {auto: true, insertedBy: this.constructor.name}
};
const fdsp = {
sequence: cur._sequence,
point: cur._point,
remarks: "First shotpoint of the day",
labels: ["FDSP", "Prod"],
meta: {auto: true, insertedBy: this.constructor.name}
};
INFO("LDSP", ldsp);
INFO("FDSP", fdsp);
const projectId = await schema2pid(prev._schema);
if (projectId) {
await event.post(projectId, ldsp);
await event.post(projectId, fdsp);
} else {
ERROR("projectId not found for", prev._schema);
}
} else {
WARNING("False positive on these timestamps", prev.time, cur.time);
WARNING("No events were created");
}
}
}
// Processing of this shot has already been completed.
// The queue can now move forward.
} catch (err) {
ERROR(err);
} finally {
cur.isPending = false;
}
}
constructor () {
DEBUG(`${this.author} instantiated`);
}
async run (data) {
async run (data, ctx) {
if (!data || data.channel !== "realtime") {
return;
}
@@ -116,27 +27,70 @@ class DetectFDSP {
return;
}
const meta = data.payload.new.meta;
if (this.queue.length < DetectFDSP.MAX_QUEUE_SIZE) {
const event = {
isPending: this.queue.length,
_schema: meta._schema,
time: meta.time,
lineStatus: meta.lineStatus,
_sequence: meta._sequence,
_point: meta._point,
lineName: meta.lineName
};
this.queue.push(event);
// DEBUG("EVENT", event);
} else {
ALERT("Queue full at", this.queue.length);
if (!this.prev) {
DEBUG("Initialising `prev`");
this.prev = data;
return;
}
this.processQueue();
try {
DEBUG("Running");
const cur = data;
const sequence = Number(cur._sequence);
if (this.prev.lineName == cur.lineName && this.prev._sequence == cur._sequence &&
this.prev.lineStatus == "online" && cur.lineStatus == "online" && sequence) {
if (this.prev.time.substr(0, 10) != cur.time.substr(0, 10)) {
// Possibly a date change, but could also be a missing timestamp
// or something else.
const ts0 = new Date(this.prev.time)
const ts1 = new Date(cur.time);
if (!isNaN(ts0) && !isNaN(ts1) && ts0.getUTCDay() != ts1.getUTCDay()) {
INFO("Sequence shot across midnight UTC detected", cur._sequence, cur.lineName);
const ldsp = {
sequence: this.prev._sequence,
point: this.prev._point,
remarks: "Last shotpoint of the day",
labels: ["LDSP", "Prod"],
meta: {auto: true, author: `*${this.constructor.name}*`}
};
const fdsp = {
sequence: cur._sequence,
point: cur._point,
remarks: "First shotpoint of the day",
labels: ["FDSP", "Prod"],
meta: {auto: true, author: `*${this.constructor.name}*`}
};
INFO("LDSP", ldsp);
INFO("FDSP", fdsp);
const projectId = await schema2pid(this.prev._schema);
if (projectId) {
await event.post(projectId, ldsp);
await event.post(projectId, fdsp);
} else {
ERROR("projectId not found for", this.prev._schema);
}
} else {
WARNING("False positive on these timestamps", this.prev.time, cur.time);
WARNING("No events were created");
}
}
}
} catch (err) {
DEBUG(`${this.author} error`, err);
throw err;
} finally {
this.prev = data;
}
}
}
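
A worked example of the two-step midnight check used above, with illustrative timestamps (2023-10-22 is a Sunday UTC):

// Illustration only: the DetectFDSP midnight-crossing test.
const prev = { time: "2023-10-22T23:59:58.000Z" };
const cur  = { time: "2023-10-23T00:00:02.000Z" };

// Step 1: cheap string comparison of the calendar date.
prev.time.substr(0, 10) != cur.time.substr(0, 10);   // true → candidate crossing

// Step 2: confirm with parsed timestamps. getUTCDay() (day of week) differs
// across a genuine midnight crossing, while a malformed or missing timestamp
// that merely fails the string test is caught by the isNaN() guards instead.
const ts0 = new Date(prev.time), ts1 = new Date(cur.time);
!isNaN(ts0) && !isNaN(ts1) && ts0.getUTCDay() != ts1.getUTCDay(); // true → post LDSP/FDSP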

View File

@@ -0,0 +1,60 @@
const project = require('../../lib/db/project');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
class DetectProjectConfigurationChange {
author = `*${this.constructor.name}*`;
constructor (ctx) {
DEBUG(`${this.author} instantiated`);
// Grab project configurations.
// NOTE that this will run asynchronously
this.run({channel: "project"}, ctx);
}
async run (data, ctx) {
if (!data || data.channel !== "project") {
return;
}
// Project notifications, as of this writing, most likely
// do not carry payloads as those exceed the notification
// size limit.
// For our purposes, we do not care as we just re-read all
// the configurations for all non-archived projects.
try {
DEBUG("Project configuration change detected")
const projects = await project.get();
const _ctx_data = {};
for (let pid of projects.map(i => i.pid)) {
DEBUG("Retrieving configuration for", pid);
const cfg = await project.configuration.get(pid);
if (cfg?.archived === true) {
DEBUG(pid, "is archived. Ignoring");
continue;
}
DEBUG("Saving configuration for", pid);
_ctx_data[pid] = cfg;
}
if (! ("projects" in ctx)) {
ctx.projects = {};
}
ctx.projects.configuration = _ctx_data;
DEBUG("Committed project configuration to ctx.projects.configuration");
} catch (err) {
DEBUG(`${this.author} error`, err);
throw err;
}
}
}
module.exports = DetectProjectConfigurationChange;

View File

@@ -3,94 +3,16 @@ const { event } = require('../../lib/db');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
class DetectSoftStart {
/* Data may come much faster than we can process it, so we put it
* in a queue and process it at our own pace.
*
* The run() method fills the queue with the necessary data and then
* calls processQueue().
*
* The processQueue() method takes the first two elements in
* the queue and processes them if they are not already being taken
* care of by a previous processQueue() call; this will happen when
* data is coming in faster than it can be processed.
*
* If the processQueue() call is the first to see the two
* bottommost elements, it will process them and, when finished, it will set
* the `isPending` flag of the bottommost element to `false`, thus
* letting the next call know that it has work to do.
*
* If the queue was empty, run() will set the `isPending` flag of its
* first element to a falsy value, thus bootstrapping the process.
*/
static MAX_QUEUE_SIZE = 125000;
queue = [];
author = `*${this.constructor.name}*`;
prev = null;
async processQueue () {
DEBUG("Queue length", this.queue.length)
while (this.queue.length > 1) {
if (this.queue[0].isPending) {
DEBUG("Queue busy");
setImmediate(() => this.processQueue());
return;
}
const prev = this.queue.shift();
const cur = this.queue[0];
try {
// DEBUG("Previous", prev);
// DEBUG("Current", cur);
// TODO:
// Consider whether to remember if soft start / full volume events
// have already been emitted and wait until there is an online/offline
// transition before re-emitting.
// This may or may not be a good idea.
// Look for a soft start or full volume event
if (cur.num_active >= 1 && !prev.num_active && cur.num_active < cur.num_guns) {
INFO("Soft start detected @", cur.tstamp);
const projectId = await schema2pid(cur._schema ?? prev._schema);
// TODO: Try and grab the corresponding comment from the configuration?
const payload = {
tstamp: cur.tstamp,
remarks: "Soft start",
labels: [ "Daily", "Guns", "Prod" ],
meta: { author: `*${this.constructor.name}*`}
};
DEBUG("Posting event", projectId, payload);
await event.post(projectId, payload);
} else if (cur.num_active == cur.num_guns && prev.num_active < cur.num_active) {
INFO("Full volume detected @", cur.tstamp);
const projectId = await schema2pid(cur._schema ?? prev._schema);
// TODO: Try and grab the corresponding comment from the configuration?
const payload = {
tstamp: cur.tstamp,
remarks: "Full volume",
labels: [ "Daily", "Guns", "Prod" ],
meta: { author: `*${this.constructor.name}*`}
};
DEBUG("Posting event", projectId, payload);
await event.post(projectId, payload);
}
// Processing of this shot has already been completed.
// The queue can now move forward.
} catch (err) {
ERROR("DetectSoftStart Error")
ERROR(err);
} finally {
cur.isPending = false;
}
}
constructor () {
DEBUG(`${this.author} instantiated`);
}
async run (data) {
async run (data, ctx) {
if (!data || data.channel !== "realtime") {
return;
}
@@ -99,29 +21,59 @@ class DetectSoftStart {
return;
}
const meta = data.payload.new.meta;
if (this.queue.length < DetectSoftStart.MAX_QUEUE_SIZE) {
this.queue.push({
isPending: this.queue.length,
_schema: meta._schema,
tstamp: meta.tstamp ?? meta.time,
shot: meta.shot,
lineStatus: meta.lineStatus,
_sequence: meta._sequence,
_point: meta._point,
lineName: meta.lineName,
num_guns: meta.num_guns,
num_active: meta.num_active
});
} else {
ALERT("DetectSoftStart queue full at", this.queue.length);
if (!this.prev) {
DEBUG("Initialising `prev`");
this.prev = data;
return;
}
this.processQueue();
try {
DEBUG("Running");
const cur = data?.payload?.new?.meta;
const prev = this.prev?.payload?.new?.meta;
// DEBUG("%j", prev);
// DEBUG("%j", cur);
DEBUG("cur.num_guns: %d\ncur.num_active: %d\nprv.num_active: %d\ntest passed: %j", cur.num_guns, cur.num_active, prev.num_active, cur.num_active >= 1 && !prev.num_active && cur.num_active < cur.num_guns);
if (cur.num_active >= 1 && !prev.num_active && cur.num_active < cur.num_guns) {
INFO("Soft start detected @", cur.tstamp);
// FIXME Shouldn't need to use schema2pid as pid already present in payload.
const projectId = await schema2pid(cur._schema ?? prev._schema);
// TODO: Try and grab the corresponding comment from the configuration?
const payload = {
tstamp: cur.tstamp,
remarks: "Soft start",
labels: [ "Daily", "Guns", "Prod" ],
meta: {auto: true, author: `*${this.constructor.name}*`}
};
DEBUG("Posting event", projectId, payload);
await event.post(projectId, payload);
} else if (cur.num_active == cur.num_guns && prev.num_active < cur.num_active) {
INFO("Full volume detected @", cur.tstamp);
const projectId = await schema2pid(cur._schema ?? prev._schema);
// TODO: Try and grab the corresponding comment from the configuration?
const payload = {
tstamp: cur.tstamp,
remarks: "Full volume",
labels: [ "Daily", "Guns", "Prod" ],
meta: {auto: true, author: `*${this.constructor.name}*`}
};
DEBUG("Posting event", projectId, payload);
await event.post(projectId, payload);
}
} catch (err) {
DEBUG(`${this.author} error`, err);
throw err;
} finally {
this.prev = data;
}
}
}

View File

@@ -3,128 +3,15 @@ const { event } = require('../../lib/db');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
class DetectSOLEOL {
/* Data may come much faster than we can process it, so we put it
* in a queue and process it at our own pace.
*
* The run() method fills the queue with the necessary data and then
* calls processQueue().
*
* The processQueue() method takes the first two elements in
* the queue and processes them if they are not already being taken
* care of by a previous processQueue() call; this will happen when
* data is coming in faster than it can be processed.
*
* If the processQueue() call is the first to see the two
* bottommost elements, it will process them and, when finished, it will set
* the `isPending` flag of the bottommost element to `false`, thus
* letting the next call know that it has work to do.
*
* If the queue was empty, run() will set the `isPending` flag of its
* first element to a falsy value, thus bootstrapping the process.
*/
static MAX_QUEUE_SIZE = 125000;
queue = [];
author = `*${this.constructor.name}*`;
prev = null;
async processQueue () {
DEBUG("Queue length", this.queue.length)
while (this.queue.length > 1) {
if (this.queue[0].isPending) {
DEBUG("Queue busy");
setImmediate(() => this.processQueue());
return;
}
const prev = this.queue.shift();
const cur = this.queue[0];
const sequence = Number(cur._sequence);
try {
DEBUG("Sequence", sequence);
// DEBUG("Previous", prev);
// DEBUG("Current", cur);
if (prev.lineName == cur.lineName && prev._sequence == cur._sequence &&
prev.lineStatus != "online" && cur.lineStatus == "online" && sequence) {
INFO("Transition to ONLINE detected");
// DEBUG(cur);
// DEBUG(prev);
// console.log("TRANSITION TO ONLINE", prev, cur);
// Check if there are already FSP, FGSP events for this sequence
const projectId = await schema2pid(cur._schema);
const sequenceEvents = await event.list(projectId, {sequence});
const labels = ["FSP", "FGSP"].filter(l => !sequenceEvents.find(i => i.labels.includes(l)));
if (labels.includes("FSP")) {
// At this point labels contains either FSP only or FSP + FGSP,
// depending on whether a FGSP event has already been entered.
const remarks = `SEQ ${cur._sequence}, SOL ${cur.lineName}, BSP: ${(cur.speed*3.6/1.852).toFixed(1)} kt, Water depth: ${Number(cur.waterDepth).toFixed(0)} m.`;
const payload = {
type: "sequence",
sequence,
point: cur._point,
remarks,
labels
}
// console.log(projectId, payload);
INFO("Posting event", projectId, payload);
await event.post(projectId, payload);
} else {
// A first shot point has already been entered in the log,
// so we have nothing to do here.
INFO("FSP already in the log. Doing nothing");
}
} else if (prev.lineStatus == "online" && cur.lineStatus != "online") {
INFO("Transition to OFFLINE detected");
// DEBUG(cur);
// DEBUG(prev);
// console.log("TRANSITION TO OFFLINE", prev, cur);
// Check if there are already LSP, LGSP events for this sequence
const projectId = await schema2pid(prev._schema);
const sequenceEvents = await event.list(projectId, {sequence});
const labels = ["LSP", "LGSP"].filter(l => !sequenceEvents.find(i => i.labels.includes(l)));
if (labels.includes("LSP")) {
// At this point labels contains either LSP only or LSP + LGSP,
// depending on whether a LGSP event has already been entered.
const remarks = `SEQ ${prev._sequence}, EOL ${prev.lineName}, BSP: ${(prev.speed*3.6/1.852).toFixed(1)} kt, Water depth: ${Number(prev.waterDepth).toFixed(0)} m.`;
const payload = {
type: "sequence",
sequence,
point: prev._point,
remarks,
labels
}
// console.log(projectId, payload);
INFO("Posting event", projectId, payload);
await event.post(projectId, payload);
} else {
// A last shot point has already been entered in the log,
// so we have nothing to do here.
INFO("LSP already in the log. Doing nothing");
}
}
// Processing of this shot has already been completed.
// The queue can now move forward.
} catch (err) {
console.error("DetectSOLEOL Error")
console.log(err);
} finally {
cur.isPending = false;
}
}
constructor () {
DEBUG(`${this.author} instantiated`);
}
async run (data) {
async run (data, ctx) {
if (!data || data.channel !== "realtime") {
return;
}
@@ -133,30 +20,69 @@ class DetectSOLEOL {
return;
}
const meta = data.payload.new.meta;
if (this.queue.length < DetectSOLEOL.MAX_QUEUE_SIZE) {
this.queue.push({
isPending: this.queue.length,
_schema: meta._schema,
time: meta.time,
shot: meta.shot,
lineStatus: meta.lineStatus,
_sequence: meta._sequence,
_point: meta._point,
lineName: meta.lineName,
speed: meta.speed,
waterDepth: meta.waterDepth
});
} else {
// FIXME Change to alert
console.error("DetectSOLEOL queue full at", this.queue.length);
if (!this.prev) {
DEBUG("Initialising `prev`");
this.prev = data;
return;
}
this.processQueue();
try {
DEBUG("Running");
// DEBUG("%j", data);
const cur = data?.payload?.new?.meta;
const prev = this.prev?.payload?.new?.meta;
const sequence = Number(cur._sequence);
// DEBUG("%j", prev);
// DEBUG("%j", cur);
DEBUG("prv.lineName: %s\ncur.lineName: %s\nprv._sequence: %s\ncur._sequence: %s\nprv.lineStatus: %s\ncur.lineStatus: %s", prev.lineName, cur.lineName, prev._sequence, cur._sequence, prev.lineStatus, cur.lineStatus);
if (prev.lineName == cur.lineName && prev._sequence == cur._sequence &&
prev.lineStatus != "online" && cur.lineStatus == "online" && sequence) {
INFO("Transition to ONLINE detected");
// We must use schema2pid because the pid may not have been
// populated for this event.
const projectId = await schema2pid(cur._schema ?? prev._schema);
const labels = ["FSP", "FGSP"];
const remarks = `SEQ ${cur._sequence}, SOL ${cur.lineName}, BSP: ${(cur.speed*3.6/1.852).toFixed(1)} kt, Water depth: ${Number(cur.waterDepth).toFixed(0)} m.`;
const payload = {
type: "sequence",
sequence,
point: cur._point,
remarks,
labels,
meta: {auto: true, author: `*${this.constructor.name}*`}
}
INFO("Posting event", projectId, payload);
await event.post(projectId, payload);
} else if (prev.lineName == cur.lineName && prev._sequence == cur._sequence &&
prev.lineStatus == "online" && cur.lineStatus != "online" && sequence) {
INFO("Transition to OFFLINE detected");
const projectId = await schema2pid(prev._schema ?? cur._schema);
const labels = ["LSP", "LGSP"];
const remarks = `SEQ ${cur._sequence}, EOL ${cur.lineName}, BSP: ${(cur.speed*3.6/1.852).toFixed(1)} kt, Water depth: ${Number(cur.waterDepth).toFixed(0)} m.`;
const payload = {
type: "sequence",
sequence,
point: cur._point,
remarks,
labels,
meta: {auto: true, author: `*${this.constructor.name}*`}
}
INFO("Posting event", projectId, payload);
await event.post(projectId, payload);
}
} catch (err) {
DEBUG(`${this.author} error`, err);
throw err;
} finally {
this.prev = data;
}
}
}
module.exports = DetectSOLEOL;
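
The boat speed in the SOL/EOL remarks is converted from metres per second to knots inline; a quick worked example with an illustrative speed:

// m/s → km/h (× 3.6) → knots (÷ 1.852), as used in the remarks strings above:
const speed = 2.4;                               // m/s, illustrative
const bsp = (speed * 3.6 / 1.852).toFixed(1);    // "4.7" kt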

View File

@@ -1,14 +1,44 @@
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
const Handlers = [
require('./detect-project-configuration-change'),
require('./detect-soleol'),
require('./detect-soft-start'),
require('./report-line-change-time'),
require('./detect-fdsp')
];
function init () {
return Handlers.map(Handler => new Handler());
function init (ctx) {
const instances = Handlers.map(Handler => new Handler(ctx));
function prepare (data, ctx) {
const promises = [];
for (let instance of instances) {
const promise = new Promise(async (resolve, reject) => {
try {
DEBUG("Run", instance.author);
const result = await instance.run(data, ctx);
DEBUG("%s result: %O", instance.author, result);
resolve(result);
} catch (err) {
ERROR("%s error:\n%O", instance.author, err);
reject(err);
}
});
promises.push(promise);
}
return promises;
}
function despatch (data, ctx) {
return Promise.allSettled(prepare(data, ctx));
}
return { instances, prepare, despatch };
}
module.exports = {
Handlers,
init
}
};

View File

@@ -0,0 +1,231 @@
const { event, project } = require('../../lib/db');
const { withinValidity } = require('../../lib/utils/ranges');
const unique = require('../../lib/utils/unique');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
class ReportLineChangeTime {
author = `*${this.constructor.name}*`;
constructor () {
DEBUG(`${this.author} instantiated`);
}
async run (data, ctx) {
if (!data || data.channel !== "event") {
return;
}
const n = data.payload.new;
const o = data.payload.old;
if (!(n?.labels) && !(o?.labels)) {
return;
}
if (!n?.labels?.includes("FGSP") && !o?.labels?.includes("FGSP") &&
!n?.labels?.includes("LGSP") && !o?.labels?.includes("LGSP")) {
return;
}
try {
DEBUG("Running");
const cur = data;
const projectId = cur?.payload?.pid;
const forward = (cur?.payload?.old?.labels?.includes("LGSP") || cur?.payload?.new?.labels?.includes("LGSP"));
DEBUG("%j", cur);
if (!projectId) {
throw {message: "No projectID found in event", cur};
}
async function getLineChangeTime (data, forward = false) {
if (forward) {
const ospEvents = await event.list(projectId, {label: "FGSP"});
// DEBUG("ospEvents", ospEvents);
const osp = ospEvents.filter(i => i.tstamp > data.tstamp).pop();
DEBUG("fsp", osp);
// DEBUG("data", data);
if (osp) {
DEBUG("lineChangeTime", osp.tstamp - data.tstamp);
return { lineChangeTime: osp.tstamp - data.tstamp, osp };
}
} else {
const ospEvents = await event.list(projectId, {label: "LGSP"});
// DEBUG("ospEvents", ospEvents);
const osp = ospEvents.filter(i => i.tstamp < data.tstamp).shift();
DEBUG("lsp", osp);
// DEBUG("data", data);
if (osp) {
DEBUG("lineChangeTime", data.tstamp - osp.tstamp);
return { lineChangeTime: data.tstamp - osp.tstamp, osp };
}
}
}
function parseInterval (dt) {
const daySeconds = (dt/1000) % 86400;
const d = Math.floor((dt/1000) / 86400);
const dateObject = new Date(0);
dateObject.setSeconds(daySeconds);
const [ h, m, s ] = dateObject.toISOString().slice(11, 19).split(":").map(Number);
return {d, h, m, s};
}
function formatInterval (i) {
let str = "";
for (let [k, v] of Object.entries(i)) {
if (v) {
str += " " + v + " " + k;
}
}
return str.trim();
}
const deleteStaleEvents = async (seq) => {
if (seq) {
DEBUG("Will delete lct events related to sequence(s)", seq);
const jpq = `$."${this.author}"`;
const opts = {jpq};
if (Array.isArray(seq)) {
opts.sequences = unique(seq).filter(i => !!i);
} else {
opts.sequence = seq;
}
const staleEvents = await event.list(projectId, opts);
DEBUG(staleEvents.length ?? 0, "events to delete");
for (let staleEvent of staleEvents) {
DEBUG(`Deleting event id ${staleEvent.id} (seq = ${staleEvent.sequence}, point = ${staleEvent.point})`);
await event.del(projectId, staleEvent.id);
}
}
}
const createLineChangeTimeEvents = async (lineChangeTime, data, osp) => {
const events = [];
const cfg = ctx?.projects?.configuration?.[projectId] ?? {};
const nlcd = cfg?.production?.nominalLineChangeDuration * 60*1000; // m → ms
DEBUG("nlcd", nlcd);
if (nlcd && lineChangeTime > nlcd) {
const excess = lineChangeTime-nlcd;
const excessString = formatInterval(parseInterval(excess));
DEBUG("excess", excess, excessString);
// ref: The later of the two events
const ref = forward ? osp : data;
const payload = {
// tstamp: new Date(ref.tstamp-1),
sequence: ref.sequence,
point: ref.point,
remarks: `_Nominal line change duration exceeded by ${excessString}_`,
labels: [ "Nav", "Prod" ],
meta: {
auto: true,
author: this.author,
[this.author]: {
parents: [
data.id,
osp.id
],
type: "excess",
value: excess
}
}
}
events.push(payload);
DEBUG("Created line change duration exceeded event", projectId, payload);
}
const lctString = formatInterval(parseInterval(lineChangeTime));
// ref: The later of the two events
const ref = forward ? osp : data;
const payload = {
// tstamp: new Date(ref.tstamp-1),
sequence: ref.sequence,
point: ref.point,
remarks: `Line change time: ${lctString}`,
labels: [ "Nav", "Prod" ],
meta: {
auto: true,
author: this.author,
[this.author]: {
parents: [
data.id,
osp.id
],
type: "lineChangeTime",
value: lineChangeTime
}
}
};
events.push(payload);
DEBUG("Created line change duration event", projectId, payload);
return events;
}
const maybePostEvent = async (projectId, payload) => {
DEBUG("Posting event", projectId, payload);
await event.post(projectId, payload);
}
await deleteStaleEvents([o?.sequence, n?.sequence]);
if (cur?.payload?.operation == "INSERT") {
// NOTE: UPDATE on the event_log view translates to one UPDATE plus one INSERT
// on event_log_full, so we don't need to worry about UPDATE here.
const data = n;
DEBUG("INSERT seen: will add lct events related to ", data.id);
if (withinValidity(data.validity)) {
DEBUG("Event within validity period", data.validity, new Date());
data.tstamp = new Date(data.tstamp);
const { lineChangeTime, osp } = await getLineChangeTime(data, forward);
if (lineChangeTime) {
const events = await createLineChangeTimeEvents(lineChangeTime, data, osp);
if (events?.length) {
DEBUG("Deleting other events for sequence", events[0].sequence);
await deleteStaleEvents(events[0].sequence);
}
for (let payload of events) {
await maybePostEvent(projectId, payload);
}
}
} else {
DEBUG("Event outside of validity range", data.validity, "lct events not inserted");
}
}
} catch (err) {
ERROR(`${this.author} error`, err);
throw err;
}
}
}
module.exports = ReportLineChangeTime;

View File

@@ -1,23 +1,25 @@
const { listen } = require('../lib/db/notify');
const channels = require('../lib/db/channels');
const handlers = require('./handlers').init();
const handlers = require('./handlers');
const { ActionsQueue } = require('../lib/queue');
const { ERROR, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
function start () {
listen(channels, async function (data) {
const queue = new ActionsQueue();
const ctx = {}; // Context object
const { prepare, despatch } = handlers.init(ctx);
listen(channels, function (data) {
DEBUG("Incoming data", data);
for (const handler of handlers) {
// NOTE: We are intentionally passing the same instance
// of the data to every handler. This means that earlier
// handlers could, in principle, modify the data to be
// consumed by later ones, provided that they are
// synchronous (as otherwise, the completion order is
// undefined).
await handler.run(data);
}
// We don't bother awaiting
queue.enqueue(() => despatch(data, ctx));
DEBUG("Queue size", queue.length());
});
INFO("Events manager started.", handlers.length, "active handlers");
INFO("Events manager started");
}
module.exports = { start }

View File

@@ -8,23 +8,55 @@ async function main () {
INFO("Running version", await version.describe());
version.compatible()
.then( (versions) => {
const api = require('./api');
const ws = require('./ws');
try {
const api = require('./api');
const ws = require('./ws');
const periodicTasks = require('./periodic-tasks').init();
const { fork } = require('child_process');
const { fork } = require('child_process');
const port = process.env.HTTP_PORT || 3000;
const host = process.env.HTTP_HOST || "127.0.0.1";
const path = process.env.HTTP_PATH ?? "/api";
const server = api.start(port, host, path);
ws.start(server);
const port = process.env.HTTP_PORT || 3000;
const host = process.env.HTTP_HOST || "127.0.0.1";
const path = process.env.HTTP_PATH ?? "/api";
const server = api.start(port, host, path);
ws.start(server);
const eventManagerPath = [__dirname, "events"].join("/");
const eventManager = fork(eventManagerPath, /*{ stdio: 'ignore' }*/);
INFO("Versions:", versions);
INFO("Versions:", versions);
periodicTasks.start();
process.on('exit', () => eventManager.kill());
const eventManagerPath = [__dirname, "events"].join("/");
const eventManager = fork(eventManagerPath, /*{ stdio: 'ignore' }*/);
process.on("SIGINT", async () => {
DEBUG("Interrupted (SIGINT)");
eventManager.kill()
await periodicTasks.cleanup();
process.exit(0);
})
process.on("SIGHUP", async () => {
DEBUG("Stopping (SIGHUP)");
eventManager.kill()
await periodicTasks.cleanup();
process.exit(0);
})
process.on('beforeExit', async () => {
DEBUG("Preparing to exit");
eventManager.kill()
await periodicTasks.cleanup();
});
process.on('exit', async () => {
DEBUG("Exiting");
// eventManager.kill()
// periodicTasks.cleanup();
});
} catch (err) {
ERROR(err);
process.exit(2);
}
})
.catch( ({current, wanted, component}) => {
console.error(`Fatal error: incompatible ${component} version ${current} (wanted: ${wanted})`);

View File

@@ -0,0 +1,61 @@
const { setSurvey } = require('../connection');
const { replaceMarkers } = require('../../utils');
function parseValidity (row) {
if (row.validity) {
const rx = /^(.)("([\d :.+-]+)")?,("([\d :.+-]+)")?([\]\)])$/;
const m = row.validity.match(rx);
row.validity = [ m[1], m[3], m[5], m[6] ];
}
return row;
}
function transform (row) {
if (row.validity[2]) {
return {
uid: row.uid,
id: row.id,
is_deleted: true
}
} else {
row.is_deleted = false;
row.has_edits = row.id != row.uid;
row.modified_on = row.validity[1];
delete row.uid;
delete row.validity;
return row;
}
}
function unique (rows) {
const o = {};
rows.forEach(row => o[row.id] = row);
return Object.values(o);
}
/**
* Get the event change history from a given epoch (ts0),
* for all events.
*/
async function changes (projectId, ts0, opts = {}) {
if (!projectId || !ts0) {
throw {status: 400, message: "Invalid request" };
}
const client = await setSurvey(projectId);
const text = `
SELECT *
FROM event_log_changes($1);
`;
const res = await client.query(text, [ts0]);
client.release();
return opts.unique
? unique(res.rows.map(i => transform(replaceMarkers(parseValidity(i)))))
: res.rows.map(i => transform(replaceMarkers(parseValidity(i))));
}
module.exports = changes;

View File

@@ -5,5 +5,6 @@ module.exports = {
post: require('./post'),
put: require('./put'),
patch: require('./patch'),
del: require('./delete')
del: require('./delete'),
changes: require('./changes')
}

View File

@@ -10,25 +10,34 @@ async function list (projectId, opts = {}) {
const offset = Math.abs((opts.page-1)*opts.itemsPerPage) || 0;
const limit = Math.abs(Number(opts.itemsPerPage)) || null;
const filter = opts.sequence
? String(opts.sequence).includes(";")
? [ "sequence = ANY ( $1 )", [ opts.sequence.split(";") ] ]
: [ "sequence = $1", [ opts.sequence ] ]
: opts.date0
? opts.date1
? [ "date(tstamp) BETWEEN SYMMETRIC $1 AND $2", [ opts.date0, opts.date1 ] ]
: [ "date(tstamp) = $1", [ opts.date0 ] ]
: [ "true = true", [] ];
const sequence = opts.sequence && Number(opts.sequence) || null;
const sequences = opts.sequences && (Array.isArray(opts.sequences)
? opts.sequences.map(Number)
: opts.sequences.split(/[^0-9]+/).map(Number)) || null;
const date0 = opts.date0 ?? null;
const date1 = opts.date1 ?? null;
const jpq = opts.jpq || null; // jpq: JSONPath Query
const label = opts.label ?? null;
const text = `
SELECT *
FROM event_log e
WHERE
${filter[0]}
ORDER BY ${sortKey} ${sortDir};
($1::numeric IS NULL OR sequence = $1) AND
($2::numeric[] IS NULL OR sequence = ANY( $2 )) AND
($3::timestamptz IS NULL OR date(tstamp) = $3) AND
($3::timestamptz IS NULL OR
(($4::timestamptz IS NULL AND date(tstamp) = $3) OR
date(tstamp) BETWEEN SYMMETRIC $3 AND $4)) AND
($5::jsonpath IS NULL OR jsonb_path_exists(meta::jsonb, $5::jsonpath)) AND
($6::text IS NULL OR $6 = ANY(labels))
ORDER BY ${sortKey} ${sortDir}
LIMIT ${limit};
`;
const res = await client.query(text, filter[1]);
const values = [ sequence, sequences, date0, date1, jpq, label ];
const res = await client.query(text, values);
client.release();
return res.rows.map(i => replaceMarkers(i));
}
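
The rewritten query replaces the old branching filter construction with a single statement in which each optional filter is switched off by passing NULL, so callers can combine filters freely. A few combinations as used elsewhere in this changeset (project id illustrative):

// Filter combinations enabled by the NULL-coalescing query above:
await event.list("myproject", { label: "FGSP" });        // by label only
await event.list("myproject", { sequence: 81 });         // single sequence
await event.list("myproject", {                          // stale lct events, by
  sequences: [80, 81],                                   // sequence list plus a
  jpq: '$."*ReportLineChangeTime*"'                      // JSONPath query on meta
});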

View File

@@ -9,10 +9,10 @@ async function post (projectId, payload, opts = {}) {
const text = `
INSERT
INTO event_log (tstamp, sequence, point, remarks, labels)
VALUES ($1, $2, $3, replace_placeholders($4, $1, $2, $3), $5);
INTO event_log (tstamp, sequence, point, remarks, labels, meta)
VALUES ($1, $2, $3, replace_placeholders($4, $1, $2, $3), $5, $6);
`;
const values = [ p.tstamp, p.sequence, p.point, p.remarks, p.labels ];
const values = [ p.tstamp, p.sequence, p.point, p.remarks, p.labels, p.meta ];
DEBUG("Inserting new values: %O", values);
await client.query(text, values);

View File

@@ -1,17 +1,43 @@
// FIXME This code is in painful need of refactoring
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
const { setSurvey, transaction, pool } = require('../connection');
const { listen } = require('../notify');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
let last_tstamp = 0;
async function getAllProjectConfigs () {
const client = await pool.connect();
let project_configs, listener;
const text = `SELECT schema, meta AS data FROM projects;`;
const res = await client.query(text);
client.release();
return res.rows;
async function getAllProjectConfigs () {
async function getFromDatabase () {
DEBUG("Getting project configurations");
const client = await pool.connect();
try {
const text = `
SELECT schema, meta AS data
FROM projects
WHERE (meta->>'archived')::boolean IS NOT true;
`;
const res = await client.query(text);
project_configs = res.rows;
DEBUG("Have configurations for projects", project_configs.map(i => i.data.id));
} catch (err) {
ERROR(err);
} finally {
client.release();
}
return project_configs;
}
if (project_configs) {
return project_configs;
} else {
listener = await listen(["project"], getFromDatabase);
DEBUG("Added project configuration change listener");
return await getFromDatabase();
}
}
async function getNearestPreplot (candidates) {
@@ -237,7 +263,7 @@ async function getCandidates (navData) {
});
return obj;
}).filter(c => !!c);
DEBUG("Candidates: %j", candidates.map(c => c.schema));
// DEBUG("Candidates: %j", candidates.map(c => c.schema));
return candidates;
}
@@ -269,7 +295,7 @@ async function save (navData, opts = {}) {
// Only one candidate, associate with it
// console.log("Save into schema", candidates[0].match.schema);
await saveOnline(candidates);
navData.payload._schema = candidates[0].match.schema;
navData.payload._schema = candidates[0].schema;
} else {
// More than one candidate, go for the closest. If more than one active
// project with the same preplots, highest numbered schema.
@@ -309,6 +335,7 @@ async function save (navData, opts = {}) {
}
await saveOffline(navData, opts);
DEBUG("Saved");
}
module.exports = save;

View File

@@ -1,5 +1,43 @@
const { makeSubscriber } = require('./connection');
const { makeSubscriber, pool } = require('./connection');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
async function purge () {
DEBUG("Purging old notifications");
const client = await pool.connect();
try {
await client.query("CALL purge_notifications();");
} catch (err) {
ERROR(err);
} finally {
client.release();
}
}
async function fullPayload (payload) {
if (!payload.payload_id) {
return payload;
} else {
let client, res;
try {
client = await pool.connect();
const text = `SELECT payload FROM notify_payloads WHERE id = $1;`;
const values = [ payload.payload_id ];
res = await client.query(text, values);
res = res?.rows[0]?.payload;
DEBUG(`Oversize notification payload retrieved with id ${payload.payload_id} and size ${res.length}`);
// DEBUG(res);
res = JSON.parse(res);
} catch (err) {
ERROR(err);
} finally {
if (client) {
client.release();
}
}
return res;
}
}
async function listen (addChannels, callback) {
@@ -18,11 +56,11 @@ async function listen (addChannels, callback) {
for (const channel of addChannels) {
await client.listenTo(channel);
client.notifications.on(channel, (payload) => {
client.notifications.on(channel, async (payload) => {
const data = {
channel,
_received: new Date(),
payload
payload: await fullPayload(payload)
};
callback(data);
});
@@ -32,5 +70,6 @@ async function listen (addChannels, callback) {
}
module.exports = {
listen
listen,
purge
};

View File

@@ -0,0 +1,52 @@
const Queue = require('./queue');
// Inspired by:
// https://stackoverflow.com/questions/53540348/js-async-await-tasks-queue#53540586
class ActionsQueue extends Queue {
constructor (items = []) {
super(items);
this.pending = false;
}
enqueue (action) {
return new Promise ((resolve, reject) => {
super.enqueue({ action, resolve, reject });
this.dequeue();
});
}
async dequeue () {
if (this.pending) {
return false;
}
const item = super.dequeue();
if (!item) {
return false;
}
try {
this.pending = true;
const result = await item.action(this);
this.pending = false;
item.resolve(result);
} catch (err) {
this.pending = false;
item.reject(err);
} finally {
this.dequeue();
}
}
}
module.exports = ActionsQueue;
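
A short usage sketch: enqueue() resolves with the action's result, and actions run strictly one at a time in arrival order, which is what the events listener relies on when despatching notifications:

// Usage sketch: three async actions enqueued back-to-back still run serially.
const queue = new ActionsQueue();
const delay = (ms, v) => new Promise(r => setTimeout(() => r(v), ms));

queue.enqueue(() => delay(30, "first"));
queue.enqueue(() => delay(10, "second"));            // still completes after "first"
const last = await queue.enqueue(() => delay(5, "third"));
// last === "third"; completion order is first, second, third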

View File

@@ -0,0 +1,6 @@
module.exports = {
Queue: require('./queue'),
ActionsQueue: require('./actions-queue')
};

View File

@@ -0,0 +1,22 @@
class Queue {
constructor (items = []) {
this.items = items;
}
enqueue (item) {
this.items.push(item);
}
dequeue () {
return this.items.shift();
}
length () {
return this.items.length;
}
}
module.exports = Queue;

View File

@@ -6,5 +6,7 @@ module.exports = {
flattenQCDefinitions: require('./flattenQCDefinitions'),
deepMerge: require('./deepMerge'),
removeNulls: require('./removeNulls'),
logicalPath: require('./logicalPath')
logicalPath: require('./logicalPath'),
ranges: require('./ranges'),
unique: require('./unique')
};

View File

@@ -0,0 +1,74 @@
function parseRange (str) {
const rx = /^[\[(].*,.*[)\]]$/
if (rx.test(str)) {
const lower_inclusive = str[0] == '[';
const upper_inclusive = str[str.length-1] == ']';
const [ lower, upper ] = str.slice(1,-1).split(",");
return {
upper,
lower,
upper_inclusive,
lower_inclusive
};
}
}
function parseValidity (str) {
const range = parseRange(str);
if (range) {
const ts0 = range.lower ? new Date(range.lower) : null;
const ts1 = range.upper ? new Date(range.upper) : null;
return {
...range,
lower: ts0,
upper: ts1
};
}
}
function withinValidity (range, ts) {
if (!ts) {
ts = new Date();
}
if (typeof range === "string") {
range = parseValidity(range);
}
if (range.lower) {
if (range.lower_inclusive) {
if (!(range.lower <= ts)) {
return false;
}
} else {
if (!(range.lower < ts)) {
return false;
}
}
}
if (range.upper) {
if (range.upper_inclusive) {
if (!(range.upper >= ts)) {
return false;
}
} else {
if (!(range.upper > ts)) {
return false;
}
}
}
return true;
}
module.exports = {
parseRange,
parseValidity,
withinValidity
}
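
A usage sketch with range strings in the style these helpers parse (values illustrative; an empty upper bound means the record is still current):

const { withinValidity } = require('./ranges');

// Open upper bound → still valid at any probe time after the lower bound:
withinValidity('[2023-10-01T00:00:00Z,)');                       // true

// Closed-out validity → superseded before the probe time:
withinValidity('[2023-10-01T00:00:00Z,2023-10-20T00:00:00Z)',
               new Date('2023-10-25T00:00:00Z'));                // false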

View File

@@ -0,0 +1,6 @@
function unique(array) {
return [...new Set(array)];
}
module.exports = unique;

View File

@@ -0,0 +1,38 @@
const tasks = require('./tasks');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
function init () {
const iids = [];
function start () {
INFO("Initialising %d periodic tasks", tasks.length);
for (let t of tasks) {
const iid = setInterval(t.task, t.timeout);
iids.push(iid);
}
return iids;
};
function stop () {
INFO("Stopping %d periodic tasks", iids.length);
for (let iid of iids) {
clearInterval(iid);
}
}
async function cleanup () {
stop();
DEBUG("Cleaning up %d periodic tasks", tasks.length);
for (let t of tasks) {
if (t.cleanup) {
await t.cleanup();
}
}
}
return { start, stop, cleanup, iids };
}
module.exports = {
init
};

View File

@@ -0,0 +1,4 @@
module.exports = [
require('./purge-notifications')
];

View File

@@ -0,0 +1,20 @@
const { purge } = require('../../lib/db/notify');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
const timeout = 120*1000; // 2 minutes
function task () {
DEBUG("Running task");
purge();
}
async function cleanup () {
DEBUG("Running cleanup");
await purge();
}
module.exports = {
task,
timeout,
cleanup
};

View File

@@ -180,6 +180,16 @@ components:
required: true
example: 14707
Since:
description: Starting epoch
name: since
in: path
schema:
type: string
format: date-time
required: true
example: 1970-01-01T00:00:00Z
QueryLimit:
description: Maximum number of results to return
name: limit
@@ -206,6 +216,16 @@ components:
pattern: "(([^\\s,;:]+)(\\s*[,;:\\s]\\s*)?)+"
example: "line,point,tstamp"
Unique:
description: |
Return unique results. Any value at all represents `true`.
name: unique
in: query
schema:
type: string
pattern: ".+"
example: "t"
schemas:
Duration:
@@ -602,14 +622,26 @@ components:
          Flag to indicate that this event is read-only. It cannot be edited or deleted by the user. Typically this concerns system-generated events such as QC results or midnight shots.
        additionalProperties: true
    EventIDAbstract:
      type: object
      properties:
        id:
          type: number
          description: Event ID.
    EventUIDAbstract:
      type: object
      properties:
        uid:
          type: number
          description: Event instance unique ID. When an event is modified, the new entry acquires a different `uid` while keeping the same `id` as the original event.
    EventAbstract:
      allOf:
        -
          type: object
          properties:
            id:
              type: number
              description: Event ID.
          $ref: "#/components/schemas/EventIDAbstract"
        -
          $ref: "#/components/schemas/EventNew"
@@ -659,6 +691,47 @@ components:
          * The third element is either an ISO-8601 timestamp or `null`. The latter indicates +∞. These are the events returned by endpoints that do not concern themselves with event history.
          * The fourth element is one of `]` or `)`. As before, it indicates either an open or closed interval.
    EventChangesIsDeletedAbstract:
      type: object
      properties:
        is_deleted:
          type: boolean
          description: >
            Flag to indicate whether this event or event instance (depending on the presence of a `uid` attribute) has been deleted.
    EventChangesModified:
      description: An event modification.
      allOf:
        -
          $ref: "#/components/schemas/EventAbstract"
        -
          $ref: "#/components/schemas/EventChangesIsDeletedAbstract"
    EventChangesDeleted:
      description: |
        Identification of a deleted event or event instance.

        **Note:** the details of the deleted event are not included, only its `id` and `uid`.
      allOf:
        -
          $ref: "#/components/schemas/EventIDAbstract"
        -
          $ref: "#/components/schemas/EventUIDAbstract"
        -
          $ref: "#/components/schemas/EventChangesIsDeletedAbstract"
    EventChanges:
      description: List of event changes since the given epoch.
      type: array
      items:
        anyOf:
          -
            $ref: "#/components/schemas/EventChangesDeleted"
          -
            $ref: "#/components/schemas/EventChangesModified"
    SeisExportEntryFSP:
      type: object
      properties:
@@ -1159,9 +1232,55 @@ paths:
          content:
            application/json:
              schema:
                type: array
                items:
                  $ref: "#/components/schemas/PlannedSequence"
                type: object
                properties:
                  remarks:
                    type: string
                    description: Planner remarks
                  sequences:
                    type: array
                    items:
                      $ref: "#/components/schemas/PlannedSequence"
            text/csv:
              schema:
                type: string
                format: csv
                description: |
                  Returns a CSV response containing one row for each planned sequence, with the following columns:

                  * `sequence`: Sequence number
                  * `line`: Line number
                  * `fsp`: First shotpoint
                  * `lsp`: Last shotpoint
                  * `ts0`: Estimated timestamp of the first shotpoint
                  * `ts1`: Estimated timestamp of the last shotpoint
                  * `name`: Line name
                  * `remarks`: Arbitrary comments
                  * `num_points`: Number of shotpoints
                  * `duration`: Estimated duration in seconds
                  * `length`: Line length in metres
                  * `azimuth`: Line azimuth
                  * `lon0`: Longitude of the first shotpoint
                  * `lat0`: Latitude of the first shotpoint
                  * `lon1`: Longitude of the last shotpoint
                  * `lat1`: Latitude of the last shotpoint
                example: |
                  "sequence","line","fsp","lsp","ts0","ts1","name","remarks","num_points","duration","length","azimuth","lon0","lat0","lon1","lat1"
                  81,5162,2422,1158,"2023-10-22T11:09:24.912Z","2023-10-22T12:56:03.395Z","2051621081S00000","",633,6398,15799.988472147348,26.4703415983101,2.474872,59.086695,2.596266,59.214146
                  82,5178,2444,1146,"2023-10-22T12:56:03.000Z","2023-10-22T14:45:33.607Z","2051781082S00000","",650,6570,16225.02094944685,26.470137885560813,2.469632,59.085264,2.594277,59.216147
            text/html:
              schema:
                type: string
                format: html
                description: |
                  An HTML representation of the plan.
            application/pdf:
              schema:
                type: string
                contentMediaType: application/pdf
                description: |
                  A PDF representation of the plan.
    post:
      description: Add a new sequence to the plan.
@@ -1382,6 +1501,31 @@ paths:
$ref: "#/components/responses/401"
/project/{project}/changes/{since}:
get:
summary: Get event change history since epoch.
tags: [ "log" ]
security:
- BearerAuthGuest: []
- CookieAuthGuest: []
parameters:
- $ref: "#/components/parameters/Project"
- $ref: "#/components/parameters/Since"
- $ref: "#/components/parameters/Unique"
responses:
"200":
description: List of project event changes. If `unique` is given, only the latest version of each event will be returned, otherwise the entire modification history is given, potentially including the same event `id` multiple times.
content:
application/json:
schema:
type: array
items:
$ref: "#/components/schemas/EventChanges"
"401":
$ref: "#/components/responses/401"
/project/{project}/label:
get:
summary: Get project labels.
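For illustration, a hypothetical client call against the new changes endpoint, with an invented response shape combining the `EventChangesModified` and `EventChangesDeleted` schemas above; none of the values are real.

// Hypothetical client; base URL, authentication and the `project` and
// `since` variables are assumed.
const res = await fetch(`/project/${project}/changes/${since}?unique=t`);
const changes = await res.json();
// e.g. [
//   { id: 1402, uid: 8731, is_deleted: false /* ...remaining event fields */ },
//   { id: 1398, uid: 8650, is_deleted: true }
// ]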

View File

@@ -95,7 +95,8 @@ for (const header of (cfg._("global.navigation.headers") || []).filter(h => h.ty
const server = dgram.createSocket('udp4');
server.on('error', (err) => {
  console.error(`server error:\n${err.stack}`);
  ERROR(err);
  // console.error(`server error:\n${err.stack}`);
  maybeSendError(err, {title: "UDP listener error on port "+header.port});
  // server.close();
});