Compare commits

...

51 Commits

Author SHA1 Message Date
D. Berge
12a762f44f Fix typo in @dougal/binary 2025-08-16 14:55:53 +02:00
D. Berge
ebf13abc28 Merge branch '337-fix-event-queue' into 'devel'
Resolve "Automatic event detection fault: soft start on every shot during line"

Closes #337

See merge request wgp/dougal/software!61
2025-08-16 12:55:15 +00:00
D. Berge
b3552db02f Add error checking to ETag logic 2025-08-16 11:36:43 +02:00
D. Berge
cd882c0611 Add debug info to soft start detection 2025-08-16 11:36:43 +02:00
D. Berge
6fc9c020a4 Fix off-by-one error in LGSP detection 2025-08-16 11:36:43 +02:00
D. Berge
75284322f1 Modify full volume detection on Smartsource
The Smartsource firmware seems to have changed, rendering the
old test invalid.
2025-08-16 11:36:43 +02:00
D. Berge
e849c47f01 Remove old queue implementation 2025-08-16 11:36:43 +02:00
D. Berge
387d20a4f0 Rewrite automatic event handling system 2025-08-16 11:36:43 +02:00
D. Berge
2fab06d340 Don't send timestamp when patching seq+point events.
Closes #339.
2025-08-16 11:35:35 +02:00
D. Berge
7d2fb5558a Hide switches to enable additional graphs.
All violin plots as well as position scatter plots and histograms
are shown by default. This is due to #338.

For some reason, having them enabled from the get-go does not
cause any problems.
2025-08-15 18:09:51 +02:00
D. Berge
764e2cfb23 Rename endpoint 2025-08-14 13:34:36 +02:00
D. Berge
bf1af1f76c Make it explicit that :id is numeric 2025-08-14 13:34:27 +02:00
D. Berge
09e4cd2467 Add CSV event import.
Closes #336
2025-08-14 13:33:30 +02:00
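A minimal sketch of an accepted input file, based on the CSV import middleware
further down in this diff (header names are case-insensitive; rows without
remarks are skipped; "labels" is split on ";"; each row carries either
sequence + point or a date/time/timestamp, with a missing date or time falling
back to the previous row's, or the current, values; the data below is made up):

  date,time,sequence,point,remarks,labels
  2025-08-14,10:00:00,,,Start of watch,ops
  ,,101,2043,Gun 3 autofire,gun;qc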
D. Berge
2009d73a2b Fix action registration and unregistration 2025-08-13 17:03:00 +02:00
D. Berge
083ee812de Use cookies for authentication as a last resort.
Fixes #335
2025-08-13 16:54:38 +02:00
D. Berge
84510e8dc9 Add proper logging 2025-08-13 15:42:49 +02:00
D. Berge
7205ec42a8 Fix handler registration.
The way it was being done meant that unregisterHandlers would not
have worked.
2025-08-13 15:42:49 +02:00
D. Berge
73d85ef81f Fix scheduling of token refresh via websocket 2025-08-13 12:58:36 +02:00
D. Berge
6c4dc35461 Fix bad status on preplot lines tab
If there were no raw / final sequences on a line, planned sequences
would not show either.
2025-08-13 12:45:50 +02:00
D. Berge
a5ebff077d Fix authentication middleware erroring on IPv6 2025-08-13 11:50:20 +02:00
D. Berge
2a894692ce Throttle snack notifications 2025-08-12 00:22:09 +02:00
D. Berge
25690eeb52 Fix showSnack in main.js 2025-08-11 23:48:08 +02:00
D. Berge
3f9776b61d Let the user know when we're getting gateway errors 2025-08-11 23:47:25 +02:00
D. Berge
8c81daefc0 Move the two /configuration endpoints next to each other 2025-08-11 22:20:46 +02:00
D. Berge
c173610e87 Simplify middleware 2025-08-11 22:19:51 +02:00
D. Berge
301e5c0731 Set headers only on 304 2025-08-11 22:06:51 +02:00
D. Berge
48d9f45fe0 Clean up debug messages 2025-08-11 22:06:20 +02:00
D. Berge
cd23a78592 Merge branch '190-refactor-map' into 'devel'
Resolve "Refactor map"

Closes #190, #322, #323, #324, #325, #326, and #321

See merge request wgp/dougal/software!25
2025-08-11 13:01:00 +00:00
D. Berge
e368183bf0 Show release notes for previous versions too 2025-08-11 14:59:22 +02:00
D. Berge
02477b071b Compress across the board.
It's still subject to the compression module's filters, but now
we try to compress every response in principle.
2025-08-11 13:57:11 +02:00
D. Berge
6651868ea7 Enable compression for vessel track responses 2025-08-11 13:40:53 +02:00
D. Berge
c0b52a8245 Be more aggressive about what gets compressed 2025-08-11 12:42:48 +02:00
D. Berge
90ce6f063e Remove dead code 2025-08-11 02:31:43 +02:00
D. Berge
b2fa0c3d40 Flatten vesselTrackConfig for better reactivity 2025-08-11 02:31:12 +02:00
D. Berge
83ecaad4fa Change vessel colour 2025-08-11 01:57:40 +02:00
D. Berge
1c5fd2e34d Calculate properly first / last timestamps of vessel tracks 2025-08-11 01:56:46 +02:00
D. Berge
aabcc74891 Add compression to some endpoints.
Consideration will be given to adding (conditional) compression
to all endpoints.
2025-08-11 01:53:50 +02:00
D. Berge
2a7b51b995 Squash another cookie 2025-08-11 01:52:04 +02:00
D. Berge
5d19ca7ca7 Add authentication to vessel track request 2025-08-10 22:03:25 +02:00
D. Berge
910195fc0f Comment out "Map settings" control on map.
Not sure it will actually be used, after all.
2025-08-10 21:53:55 +02:00
D. Berge
6e5570aa7c Add missing require 2025-08-10 21:53:04 +02:00
D. Berge
595c20f504 Add vessel position to map.
Updates via websocket using the `realtime` channel notification
message.
2025-08-10 21:52:02 +02:00
D. Berge
40d0038d80 Add vessel track layer to map.
Track length may be changed by clicking on the appropriate icon.
2025-08-10 21:47:43 +02:00
D. Berge
acdf118a67 Add new /vessel/track endpoints.
This is a variation on /navdata but returns data more suitable
for plotting vessel tracks on the map.
2025-08-10 21:39:35 +02:00
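A sketch of a request to the new endpoint, with the query parameters the map
component sends later in this diff (di = decimation, l = row limit; ts0 and
ts1 are assumed to be Date objects bounding the window, and jwt a valid token):

  const url = `/api/vessel/track/?di=12&l=10000&ts0=${ts0.toISOString()}&ts1=${ts1.toISOString()}`;
  const res = await fetch(url, { headers: { Authorization: `Bearer ${jwt}` } });
  const track = await res.json(); // JSON array of track points with x, y and tstamp fields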
D. Berge
b9e0975d3d Add clone routine to project DB lib (WIP).
This relates to #333.
2025-08-10 21:37:12 +02:00
D. Berge
39d9c9d748 Fix GeoJSON returned by /navdata endpoint 2025-08-10 21:36:37 +02:00
D. Berge
b8b25dcd62 Update IP getter script to return LAN address.
get-ip.sh internet: returns the first IP address found that has
internet access.

get-ip.sh local (or no argument): returns the list of non-loopback
IPs minus the one that has internet access.

This means that update-dns.sh now sends the first IP address that
does *not* have internet access.
2025-08-09 22:27:23 +02:00
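Usage sketch (script location assumed):

  $ ./sbin/get-ip.sh internet   # first IP address found that has internet access
  $ ./sbin/get-ip.sh local      # non-loopback IPs minus the internet-facing one
  $ ./sbin/get-ip.sh            # same as "local"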
D. Berge
db97382758 Add scripts to automatically update the LAN DNS records.
./sbin/update-dns.sh may be called at regular intervals (one hour
or so) via crontab.

It will automatically detect:
- its local host name (*.lan.dougal.aaltronav.eu); and
- which IP has internet access, if any.

Armed with that information and with the dynamic DNS API password
stored in DYNDNS_PASSWD in ~/.dougalrc, it will update the relevant
DNS record.

For this to work, the first `lan.dougal` hostname in the Nginx
configuration must be the one that is set up for dynamic update.
Other `lan.dougal` hostnames should be CNAME records pointing to
the first one.
2025-08-09 18:37:15 +02:00
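For instance, an hourly crontab entry (installation path assumed):

  0 * * * * /opt/dougal/sbin/update-dns.sh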
D. Berge
ae8e5d4ef6 Do not use cookies for backend authentication 2025-08-09 12:43:17 +02:00
D. Berge
2c1a24e4a5 Do not store JWT in document.cookie 2025-08-09 12:14:17 +02:00
D. Berge
0b83187372 Provide authorisation details to Deck.gl layers.
Those layers that call API endpoints directly no longer need to
rely on cookies as they use the JWT token directly via the
`Authorization` header.
2025-08-09 12:12:24 +02:00
66 changed files with 408620 additions and 472 deletions

View File

@@ -693,7 +693,7 @@ class DougalBinaryChunkSequential extends ArrayBuffer {
getRecord (index) {
if (index < 0 || index >= this.jCount) throw new Error(`Invalid record index: ${index}`);
const arr = [thid.udv, this.i, this.j0 + index * this.Δj];
const arr = [this.udv, this.i, this.j0 + index * this.Δj];
for (let m = 0; m < this.ΔelemCount; m++) {
const values = this.Δelem(m);

View File

@@ -9,10 +9,12 @@
"dependencies": {
"@deck.gl/aggregation-layers": "^9.1.13",
"@deck.gl/geo-layers": "^9.1.13",
"@deck.gl/mesh-layers": "^9.1.14",
"@dougal/binary": "file:../../../modules/@dougal/binary",
"@dougal/concurrency": "file:../../../modules/@dougal/concurrency",
"@dougal/organisations": "file:../../../modules/@dougal/organisations",
"@dougal/user": "file:../../../modules/@dougal/user",
"@loaders.gl/obj": "^4.3.4",
"@mdi/font": "^7.2.96",
"buffer": "^6.0.3",
"core-js": "^3.6.5",

File diff suppressed because it is too large.

View File

@@ -92,18 +92,12 @@ export default {
this.$store.dispatch('registerHandler', {
table: '.jwt',
handler: (context, message) => {
this.handleJWT(context, message);
}
handler: this.handleJWT
});
this.$store.dispatch('registerHandler', {
table: 'project',
handler: (context, message) => {
this.handleProject(context, message);
}
handler: this.handleProject
});
},

View File

@@ -10,7 +10,10 @@
<v-spacer></v-spacer>
<template v-if="isFrontendRemote">
<v-icon v-if="serverConnected" class="mr-6" title="Connected to server via gateway">mdi-cloud-outline</v-icon>
<template v-if="serverConnected">
<v-icon v-if="isGatewayReliable" class="mr-6" title="Connected to server via gateway">mdi-cloud-outline</v-icon>
<v-icon v-else class="mr-6" color="orange" title="Gateway connection is unreliable. Expect outages.">mdi-cloud-off</v-icon>
</template>
<v-icon v-else class="mr-6" color="red" :title="`Server connection lost: the gateway cannot reach the remote server.\nWe will reconnect automatically when the link with the remote server is restored.`">mdi-cloud-off</v-icon>
</template>
<template v-else>
@@ -57,6 +60,13 @@ export default {
DougalNotificationsControl
},
data () {
return {
lastGatewayErrorTimestamp: 0,
gatewayErrorSilencePeriod: 60000,
}
},
computed: {
year () {
const date = new Date();
@@ -65,8 +75,24 @@ export default {
...mapState({
serverConnected: state => state.notify.serverConnected,
isFrontendRemote: state => state.api.serverInfo?.["remote-frontend"] ?? false
isFrontendRemote: state => state.api.serverInfo?.["remote-frontend"] ?? false,
isGatewayReliable: state => state.api.isGatewayReliable
})
},
watch: {
isGatewayReliable (val) {
if (val === false) {
const elapsed = Date.now() - this.lastGatewayErrorTimestamp;
this.lastGatewayErrorTimestamp = Date.now();
if (elapsed > this.gatewayErrorSilencePeriod) {
this.$root.showSnack("Gateway error", "warning");
}
}
}
}
};
</script>

View File

@@ -3,8 +3,10 @@
<v-card-title class="headline">
Array inline / crossline error
<v-spacer></v-spacer>
<!--
<v-switch v-model="scatterplot" label="Scatterplot"></v-switch>
<v-switch class="ml-4" v-model="histogram" label="Histogram"></v-switch>
-->
</v-card-title>
<v-container fluid fill-height>
@@ -57,8 +59,8 @@ export default {
graph: [],
busy: false,
resizeObserver: null,
scatterplot: false,
histogram: false
scatterplot: true,
histogram: true
};
},

View File

@@ -3,8 +3,10 @@
<v-card-title class="headline">
Gun depth
<v-spacer></v-spacer>
<!--
<v-switch v-model="shotpoint" label="Shotpoint"></v-switch>
<v-switch class="ml-4" v-model="violinplot" label="Violin plot"></v-switch>
-->
</v-card-title>
<v-container fluid fill-height>
@@ -59,7 +61,7 @@ export default {
busy: false,
resizeObserver: null,
shotpoint: true,
violinplot: false
violinplot: true
};
},

View File

@@ -3,8 +3,10 @@
<v-card-title class="headline">
Gun pressures
<v-spacer></v-spacer>
<!--
<v-switch v-model="shotpoint" label="Shotpoint"></v-switch>
<v-switch class="ml-4" v-model="violinplot" label="Violin plot"></v-switch>
-->
</v-card-title>
<v-container fluid fill-height>
@@ -59,7 +61,7 @@ export default {
busy: false,
resizeObserver: null,
shotpoint: true,
violinplot: false
violinplot: true
};
},

View File

@@ -3,8 +3,10 @@
<v-card-title class="headline">
Gun timing
<v-spacer></v-spacer>
<!--
<v-switch v-model="shotpoint" label="Shotpoint"></v-switch>
<v-switch class="ml-4" v-model="violinplot" label="Violin plot"></v-switch>
-->
</v-card-title>
<v-container fluid fill-height>
@@ -59,7 +61,7 @@ export default {
busy: false,
resizeObserver: null,
shotpoint: true,
violinplot: false
violinplot: true
};
},

View File

@@ -44,7 +44,16 @@
</v-card-title>
<v-card-text>
<pre>{{ versionHistory }}</pre>
<v-carousel v-model="releaseShown"
:continuous="false"
:cycle="false"
:show-arrows="true"
:hide-delimiters="true"
>
<v-carousel-item v-for="release in releaseHistory">
<pre>{{release}}</pre>
</v-carousel-item>
</v-carousel>
</v-card-text>
@@ -127,6 +136,8 @@ export default {
clientVersion: process.env.DOUGAL_FRONTEND_VERSION ?? "(unknown)",
serverVersion: null,
versionHistory: null,
releaseHistory: [],
releaseShown: null,
page: "support"
};
},
@@ -138,7 +149,8 @@ export default {
this.serverVersion = version?.tag ?? "(unknown)";
}
if (!this.versionHistory) {
const history = await this.api(['/version/history?count=1', {}, null, {silent:true}]);
const history = await this.api(['/version/history?count=3', {}, null, {silent:true}]);
this.releaseHistory = history;
this.versionHistory = history?.[this.serverVersion.replace(/-.*$/, "")] ?? null;
}
},

View File

@@ -1,8 +1,5 @@
<template>
<div class="line-status" v-if="sequences.length == 0">
<slot name="empty"></slot>
</div>
<div class="line-status" v-else-if="sequenceHref || plannedSequenceHref || pendingReshootHref">
<div class="line-status" v-if="sequenceHref || plannedSequenceHref || pendingReshootHref">
<router-link v-for="sequence in sequences" :key="sequence.sequence" v-if="sequenceHref"
class="sequence"
:class="sequence.status"
@@ -26,7 +23,7 @@
>
</router-link>
</div>
<div class="line-status" v-else>
<div class="line-status" v-else-if="sequences.length || plannedSequences.length || Object.keys(pendingReshoots).length">
<div v-for="sequence in sequences" :key="sequence.sequence"
class="sequence"
:class="sequence.status"
@@ -47,6 +44,9 @@
>
</div>
</div>
<div class="line-status" v-else>
<slot name="empty"></slot>
</div>
</template>
<style lang="stylus" scoped>

View File

@@ -62,9 +62,7 @@ new Vue({
showSnack(text, colour = "primary") {
console.log("showSnack", text, colour);
this.snackColour = colour;
this.snackText = text;
this.snack = true;
this.$store.dispatch("showSnack", [text, colour]);
},
sendJwt () {

View File

@@ -71,7 +71,7 @@ async function api ({state, getters, commit, dispatch}, [resource, init = {}, cb
res = await limiter.enqueue(async () => await fetch(url, init));
}
if (cache && !isCached) {
if (cache && !isCached && res.ok) { // Only cache successful responses
cache.put(url, res.clone());
}
@@ -95,6 +95,12 @@ async function api ({state, getters, commit, dispatch}, [resource, init = {}, cb
return [key, value];
});
state.serverInfo = entries.length ? Object.fromEntries(entries) : {};
if (state.serverInfo["remote-frontend"]) {
state.isGatewayReliable = ![ 502, 503, 504 ].includes(res.status);
} else {
state.isGatewayReliable = null;
}
}
if (res.ok) {

View File

@@ -2,7 +2,8 @@ const state = () => ({
apiUrl: "/api",
requestsCount: 0,
maxConcurrent: 15,
serverInfo: {} // Contents of the last received X-Dougal-Server HTTP header
serverInfo: {}, // Contents of the last received X-Dougal-Server HTTP header
isGatewayReliable: null, // Becomes false once we see HTTP 502/503/504 responses; null when not behind a gateway
});
export default state;

View File

@@ -80,4 +80,4 @@ function processServerEvent({ commit, dispatch, state, rootState }, message) {
state.debouncedRunners[table](message);
}
export default { registerHandler, processServerEvent };
export default { registerHandler, unregisterHandler, processServerEvent };

View File

@@ -30,4 +30,10 @@ function UNREGISTER_HANDLER(state, { table, handler }) {
}
export default { setServerEvent, clearServerEvent, setServerConnectionState, REGISTER_HANDLER };
export default {
setServerEvent,
clearServerEvent,
setServerConnectionState,
REGISTER_HANDLER,
UNREGISTER_HANDLER
};

View File

@@ -25,23 +25,10 @@ async function login ({ commit, dispatch }, loginRequest) {
async function logout ({ commit, dispatch }) {
commit('setToken', null);
commit('setUser', null);
commit('setCookie', {value: null});
await dispatch('api', ["/logout"]);
commit('setPreferences', {});
}
function setCookie(context, {name, value, expiry, path}) {
if (!name) name = "JWT";
if (!path) path = "/";
if (!value) value = "";
if (expiry) {
document.cookie = `${name}=${value}; expiry=${(new Date(expiry)).toUTCString()}; path=${path}`;
} else {
document.cookie = `${name}=${value}; path=${path}`;
}
}
function setCredentials({ state, commit, getters, dispatch, rootState }, { force, token, response } = {}) {
try {
let tokenValue = token;
@@ -59,17 +46,7 @@ function setCredentials({ state, commit, getters, dispatch, rootState }, { force
const decoded = jwt_decode(tokenValue);
commit('setToken', tokenValue);
commit('setUser', decoded ? new User(decoded, rootState.api.api) : null);
if (tokenValue && decoded) {
if (decoded?.exp) {
dispatch('setCookie', {value: tokenValue, expiry: decoded.exp*1000});
} else {
dispatch('setCookie', {value: tokenValue});
}
} else {
// Clear the cookie
dispatch('setCookie', {value: "", expiry: 0});
}
commit('setCookie', {name: "JWT", value: tokenValue, expires: (decoded.exp??0)*1000});
console.log('Credentials refreshed at', new Date().toISOString());
} else {
@@ -80,6 +57,7 @@ function setCredentials({ state, commit, getters, dispatch, rootState }, { force
if (err.name === 'InvalidTokenError') {
commit('setToken', null);
commit('setUser', null);
commit('clearCookie', "JWT")
}
}
dispatch('loadUserPreferences');
@@ -114,7 +92,6 @@ async function loadUserPreferences({ state, commit }) {
export default {
login,
logout,
setCookie,
setCredentials,
saveUserPreference,
loadUserPreferences

View File

@@ -7,12 +7,6 @@ function jwt (state) {
return state.token;
}
function cookie (state) {
if (state.token) {
return "JWT="+token;
}
}
function preferences (state) {
return state.preferences;
}

View File

@@ -16,4 +16,18 @@ function setPreferences (state, preferences) {
state.preferences = preferences;
}
export default { setToken, setUser, setPreferences };
function setCookie (state, opts = {}) {
const name = opts.name ?? "JWT";
const value = opts.value ?? "";
const expires = opts.expires ? (new Date(opts.expires)) : (new Date(0));
const path = opts.path ?? "/";
const sameSite = opts.sameSite ?? "Lax";
document.cookie = `${name}=${value};path=${path};SameSite=${sameSite};expires=${expires.toUTCString()}`;
}
function clearCookie (state, name) {
setCookie(state, {name});
}
export default { setToken, setUser, setPreferences, setCookie, clearCookie };

View File

@@ -737,6 +737,13 @@ export default {
if (event.id) {
const id = event.id;
delete event.id;
// If this is an edit, ensure that it is *either*
// a timestamp event or a sequence + point one.
if (event.sequence && event.point && event.tstamp) {
delete event.tstamp;
}
this.putEvent(id, event, callback); // No await
} else {
this.postEvent(event, callback); // No await

View File

@@ -31,7 +31,47 @@
<span>Vessel track</span>
<label title="Show points"><v-icon small left class="mx-0">mdi-vector-point</v-icon> <input type="checkbox" value="navp" v-model="layerSelection"/></label>
<!--
<label title="Show lines" disabled><v-icon small left class="mx-0">mdi-vector-line</v-icon> <input type="checkbox" value="navl" v-model="layerSelection"/></label>
-->
<div>
<v-menu bottom offset-y class="pb-1">
<template v-slot:activator="{ on, attrs }">
<v-icon style="margin-right: 3px;" small v-bind="attrs" v-on="on" :title="`Show lines.\nCurrently selected period: ${vesselTrackPeriodSettings[vesselTrackPeriod].title}. Click to change`">mdi-vector-line</v-icon>
</template>
<v-list nav dense>
<v-list-item @click="vesselTrackPeriod = 'hour'">
<v-list-item-content>
<v-list-item-title>Last hour</v-list-item-title>
</v-list-item-content>
</v-list-item>
<v-list-item @click="vesselTrackPeriod = 'hour6'">
<v-list-item-content>
<v-list-item-title>Last 6 hours</v-list-item-title>
</v-list-item-content>
</v-list-item>
<v-list-item @click="vesselTrackPeriod = 'hour12'">
<v-list-item-content>
<v-list-item-title>Last 12 hours</v-list-item-title>
</v-list-item-content>
</v-list-item>
<v-list-item @click="vesselTrackPeriod = 'day'">
<v-list-item-content>
<v-list-item-title>Last 24 hours</v-list-item-title>
</v-list-item-content>
</v-list-item>
<v-list-item @click="vesselTrackPeriod = 'week'">
<v-list-item-content>
<v-list-item-title>Last week</v-list-item-title>
</v-list-item-content>
</v-list-item>
</v-list>
</v-menu>
<input type="checkbox" value="navl" v-model="layerSelection"/>
</div>
<label><!-- No heatmap available --></label>
<span>Sail lines</span>
@@ -359,6 +399,7 @@
</v-select>
END QC data -->
<!--
<hr class="my-2"/>
<div title="Not yet implemented">
@@ -371,6 +412,7 @@
Map settings
</v-btn>
</div>
-->
</div>
</div>
@@ -620,6 +662,67 @@ export default {
maxPitch: 89
},
vesselPosition: null,
vesselTrackLastRefresh: 0,
vesselTrackRefreshInterval: 12, // seconds
vesselTrackIntervalID: null,
vesselTrackPeriod: "hour",
vesselTrackPeriodSettings: {
hour: {
title: "1 hour",
offset: 3600 * 1000,
decimation: 1,
refreshInterval: 18,
},
hour6: {
title: "6 hours",
offset: 6 * 3600 * 1000,
decimation: 1,
refreshInterval: 18,
},
hour12: {
title: "12 hours",
offset: 12 * 3600 * 1000,
decimation: 1,
refreshInterval: 18,
},
day: {
title: "24 hours",
offset: 24 * 3600 * 1000,
decimation: 12,
refreshInterval: 18,
},
week: {
title: "7 days",
offset: 7 * 24 * 3600 * 1000,
decimation: 60,
refreshInterval: 60,
},
week2: {
title: "14 days",
offset: 14 * 24 * 3600 * 1000,
decimation: 60,
refreshInterval: 90,
},
month: {
title: "30 days",
offset: 30 * 24 * 3600 * 1000,
decimation: 90,
refreshInterval: 120,
},
quarter: {
title: "90 days",
offset: 90 * 24 * 3600 * 1000,
decimation: 180,
refreshInterval: 300,
},
year: {
title: "1 year",
offset: 365 * 24 * 3600 * 1000,
decimation: 1200,
refreshInterval: 1800,
},
},
heatmapValue: "total_error",
isFullscreen: false,
crosshairsPositions: [],
@@ -761,6 +864,14 @@ export default {
deep: true
},
vesselTrackPeriod () {
this.updateVesselIntervalTimer();
},
vesselTrackLastRefresh () {
this.render();
},
lines () {
// Refresh map on change of preplot data
this.render();
@@ -1062,6 +1173,12 @@ export default {
return [[λ0 - , φ0 - ], [λ1 + , φ1 + ]];
},
// Returns the current second, as an integer.
// Used for triggering Deck.gl URL refreshes
currentSecond () {
return Math.floor(Date.now()/1000);
},
async getSequenceData (sequenceNumbers, types = [2, 3]) {
//const types = [2, 3]; // Bundle types: 2 → raw/gun data; 3 final data. See bundles.js
@@ -1407,6 +1524,19 @@ export default {
return arr.buffer;
},
updateVesselIntervalTimer (refreshInterval) {
this.vesselTrackRefreshInterval = refreshInterval ??
this.vesselTrackPeriodSettings[this.vesselTrackPeriod]?.refreshInterval ?? 0;
this.vesselTrackIntervalID = clearInterval(this.vesselTrackIntervalID);
if (this.vesselTrackRefreshInterval) {
this.vesselTrackLastRefresh = this.currentSecond();
this.vesselTrackIntervalID = setInterval( () => {
this.vesselTrackLastRefresh = this.currentSecond();
}, this.vesselTrackRefreshInterval * 1000);
}
},
async handleSequences (context, {payload}) {
if (payload.pid != this.$route.params.project) {
console.warn(`${this.$route.params.project} ignoring notification for ${payload.pid}`);
@@ -1438,18 +1568,46 @@ export default {
}
},
handleVesselPosition (context, {payload}) {
if (payload.new?.geometry?.coordinates) {
const now = Date.now();
const lastRefresh = this.vesselPosition?._lastRefresh;
// Limits refreshes to once every five seconds max
if (lastRefresh && (now-lastRefresh) < 5000) return;
this.vesselPosition = {
...payload.new.meta,
tstamp: payload.new.tstamp,
_lastRefresh: now
};
if (this.vesselPosition.lineStatus == "offline") {
this.vesselPosition.x = this.vesselPosition.longitude ?? payload.new.geometry.coordinates[0];
this.vesselPosition.y = this.vesselPosition.latitude ?? payload.new.geometry.coordinates[1];
} else {
this.vesselPosition.x = this.vesselPosition.longitudeMaster
?? payload.new.geometry.coordinates[0];
this.vesselPosition.y = this.vesselPosition.latitudeMaster
?? payload.new.geometry.coordinates[1];
}
this.render();
}
},
registerNotificationHandlers (action = "registerHandler") {
["raw_lines", "raw_shots", "final_lines", "final_shots"].forEach( table => {
this.$store.dispatch(action, {
table,
handler: (context, message) => {
this.handleSequences(context, message);
}
handler: this.handleSequences
})
});
this.$store.dispatch(action, {
table: 'realtime',
handler: this.handleVesselPosition
});
},
unregisterNotificationHandlers () {
@@ -1473,6 +1631,8 @@ export default {
console.log("TODO: Should switch to legacy map view");
}
this.updateVesselIntervalTimer();
this.layersAvailable.osm = this.osmLayer;
this.layersAvailable.sea = this.openSeaMapLayer;
@@ -1572,6 +1732,7 @@ export default {
beforeDestroy () {
this.unregisterNotificationHandlers();
this.vesselTrackIntervalID = clearInterval(this.vesselTrackIntervalID);
}
}

View File

@@ -5,8 +5,9 @@
import { Deck, WebMercatorViewport, FlyToInterpolator, CompositeLayer } from '@deck.gl/core';
import { GeoJsonLayer, LineLayer, PathLayer, BitmapLayer, ScatterplotLayer, ColumnLayer, IconLayer } from '@deck.gl/layers';
import {HeatmapLayer} from '@deck.gl/aggregation-layers';
import { TileLayer, MVTLayer } from '@deck.gl/geo-layers';
import { TileLayer, MVTLayer, TripsLayer } from '@deck.gl/geo-layers';
import { SimpleMeshLayer } from '@deck.gl/mesh-layers';
import { OBJLoader } from '@loaders.gl/obj';
//import { json } from 'd3-fetch';
import * as d3a from 'd3-array';
@@ -18,7 +19,6 @@ import DougalBinaryLoader from '@/lib/deck.gl/DougalBinaryLoader';
import { colors } from 'vuetify/lib'
function hexToArray (hex, defaultValue = [ 0xc0, 0xc0, 0xc0, 0xff ]) {
if (typeof hex != "string" || hex.length < 6) {
@@ -121,6 +121,21 @@ export default {
};
},
loadOptions (options = {}) {
return {
loadOptions: {
fetch: {
method: 'GET',
headers: {
'Authorization': `Bearer ${this.$store.getters.jwt}`,
}
},
...options
},
};
},
osmLayer (options = {}) {
return new TileLayer({
@@ -241,45 +256,99 @@ export default {
},
vesselTrackPointsLayer (options = {}) {
return new ScatterplotLayer({
if (!this.vesselPosition) return;
return new SimpleMeshLayer({
id: 'navp',
data: `/api/navdata?limit=10000`,
getPosition: (d) => ([d.longitude, d.latitude]),
getRadius: d => (d.speed),
radiusScale: 1,
lineWidthMinPixels: 2,
getFillColor: d => d.guns
? d.lineStatus == "online"
? [0xaa, 0x00, 0xff] // Online
: [0xd5, 0x00, 0xf9] // Soft start or guns otherwise active
: [0xea, 0x80, 0xfc], // Offline, guns inactive
getLineColor: [127, 65, 90],
getColor: [ 255, 0, 0 ],
getPointRadius: 12,
radiusUnits: "pixels",
pointRadiusMinPixels: 4,
stroked: false,
filled: true,
data: [ this.vesselPosition ],
//getColor: [ 255, 48, 0 ],
getColor: [ 174, 1, 174 ],
getOrientation: d => [0, (270 - (d.heading ?? d.cmg ?? d.bearing ?? d.lineBearing ?? 0)) % 360 , 0],
getPosition: d => [ d.x, d.y ],
mesh: `/assets/boat0.obj`,
sizeScale: 0.1,
loaders: [OBJLoader],
pickable: true,
...options
})
});
},
vesselTrackLinesLayer (options = {}) {
return new LineLayer({ // TODO Change to TrackLayer
const cfg = this.vesselTrackPeriodSettings[this.vesselTrackPeriod];
let ts1 = new Date(this.vesselTrackLastRefresh*1000);
let ts0 = new Date(ts1.valueOf() - cfg.offset);
let di = cfg.decimation;
let l = 10000;
const breakLimit = (di ? di*20 : 5 * 60) * 1000;
let trailLength = (ts1 - ts0) / 1000;
return new TripsLayer({
id: 'navl',
data: `/api/navdata?v=${Date.now()}`, // NOTE Not too sure about this
lineWidthMinPixels: 2,
getLineColor: (d) => d.properties.ntba ? [240, 248, 255, 200] : [85, 170, 255, 200],
getSourcePosition: (obj, i) => i.index < i.data?.length ? [i.data[i.index]?.longitude, i.data[i.index]?.latitude] : null,
getTargetPosition: (obj, i) => i.index < i.data?.length ? [i.data[i.index+1]?.longitude, i.data[i.index+1]?.latitude] : null,
getLineWidth: 3,
getPointRadius: 2,
radiusUnits: "pixels",
pointRadiusMinPixels: 2,
data: `/api/vessel/track/?di=${di}&l=${l}&project=&ts0=${ts0.toISOString()}&ts1=${ts1.toISOString()}`,
...this.loadOptions({
fetch: {
method: 'GET',
headers: {
Authorization: `Bearer ${this.$store.getters.jwt}`,
}
}
}),
dataTransform: (data) => {
if (data.length >= l) {
console.warn(`Vessel track data may be truncated! Limit: ${l}`);
}
const paths = [];
let prevTstamp;
paths.push({path: [], timestamps: [], num: 0, ts0: +Infinity, ts1: -Infinity});
for (const el of data) {
const tstamp = new Date(el.tstamp).valueOf();
const curPath = () => paths[paths.length-1];
if (prevTstamp && Math.abs(tstamp - prevTstamp) > breakLimit) {
// Start a new path
console.log(`Breaking path on interval ${Math.abs(tstamp - prevTstamp)} > ${breakLimit}`);
paths.push({path: [], timestamps: [], num: paths.length, ts0: +Infinity, ts1: -Infinity});
}
if (tstamp < curPath().ts0) {
curPath().ts0 = tstamp;
}
if (tstamp > curPath().ts1) {
curPath().ts1 = tstamp;
}
curPath().path.push([el.x, el.y]);
curPath().timestamps.push(tstamp/1000);
prevTstamp = tstamp;
}
paths.forEach (path => {
path.nums = paths.length;
path.ts0 = new Date(path.ts0);
path.ts1 = new Date(path.ts1);
});
return paths;
},
getPath: d => d.path,
getTimestamps: d => d.timestamps,
currentTime: ts1.valueOf() / 1000,
trailLength,
widthUnits: "meters",
widthMinPixels: 4,
getWidth: 10,
getColor: [ 174, 1, 126, 200 ],
stroked: true,
pickable: true,
...options
})
});
},
eventsLogLayer (options = {}) {
@@ -308,6 +377,7 @@ export default {
return new DougalEventsLayer({
id: 'log',
data: `/api/project/${this.$route.params.project}/event?mime=application/geo%2Bjson`,
...this.loadOptions(),
lineWidthMinPixels: 2,
getPosition: d => d.geometry.coordinates,
jitter: 0.00015,
@@ -332,6 +402,7 @@ export default {
return new GeoJsonLayer({
id: 'psll',
data: `/api/project/${this.$route.params.project}/gis/preplot/line?class=V&v=${this.lineTStamp?.valueOf()}`,
...this.loadOptions(),
lineWidthMinPixels: 1,
getLineColor: (d) => d.properties.ntba ? [240, 248, 255, 200] : [85, 170, 255, 200],
getLineWidth: 1,
@@ -347,6 +418,7 @@ export default {
return new GeoJsonLayer({
id: 'ppll',
data: `/api/project/${this.$route.params.project}/gis/preplot/line?v=${this.lineTStamp?.valueOf()}`,
...this.loadOptions(),
lineWidthMinPixels: 1,
getLineColor: (d) => d.properties.ntba ? [240, 248, 255, 200] : [85, 170, 255, 200],
getLineWidth: 1,
@@ -393,6 +465,7 @@ export default {
return new GeoJsonLayer({
id: 'seqrl',
data: `/api/project/${this.$route.params.project}/gis/raw/line?v=${this.sequenceTStamp?.valueOf()}`,
...this.loadOptions(),
lineWidthMinPixels: 1,
getLineColor: (d) => d.properties.ntbp ? [0xe6, 0x51, 0x00, 200] : [0xff, 0x98, 0x00, 200],
getLineWidth: 1,
@@ -408,6 +481,7 @@ export default {
return new GeoJsonLayer({
id: 'seqfl',
data: `/api/project/${this.$route.params.project}/gis/final/line?v=${this.sequenceTStamp?.valueOf()}`,
...this.loadOptions(),
lineWidthMinPixels: 1,
getLineColor: (d) => d.properties.pending ? [0xa7, 0xff, 0xab, 200] : [0x00, 0x96, 0x88, 200],
getLineWidth: 1,
@@ -424,12 +498,15 @@ export default {
id: 'pslp',
data: `/api/project/${this.$route.params.project}/line/sail?v=${this.lineTStamp?.valueOf()}`, // API endpoint returning binary data
loaders: [DougalBinaryLoader],
loadOptions: {
...this.loadOptions({
fetch: {
method: 'GET',
headers: { Accept: 'application/vnd.aaltronav.dougal+octet-stream' }
headers: {
Authorization: `Bearer ${this.$store.getters.jwt}`,
Accept: 'application/vnd.aaltronav.dougal+octet-stream'
}
}
},
}),
getRadius: 2,
getFillColor: (d, {data, index}) => data.attributes.value2.value[index] ? [240, 248, 255, 200] : [85, 170, 255, 200],
//getFillColor: [0, 120, 220, 200],
@@ -443,12 +520,15 @@ export default {
id: 'pplp',
data: `/api/project/${this.$route.params.project}/line/source?v=${this.lineTStamp?.valueOf()}`, // API endpoint returning binary data
loaders: [DougalBinaryLoader],
loadOptions: {
...this.loadOptions({
fetch: {
method: 'GET',
headers: { Accept: 'application/vnd.aaltronav.dougal+octet-stream' }
headers: {
Authorization: `Bearer ${this.$store.getters.jwt}`,
Accept: 'application/vnd.aaltronav.dougal+octet-stream'
}
}
},
}),
getRadius: 2,
getFillColor: (d, {data, index}) => data.attributes.value2.value[index] ? [240, 248, 255, 200] : [85, 170, 255, 200],
//getFillColor: [0, 120, 220, 200],

View File

@@ -1,4 +1,5 @@
<script>
import * as d3a from 'd3-array';
export default {
name: "MapTooltipsMixin",
@@ -28,6 +29,8 @@ export default {
return this.sequenceLinesTooltip(args);
} else if (args?.layer?.id == "navp") {
return this.vesselTrackPointsTooltip(args);
} else if (args?.layer?.id == "navl") {
return this.vesselTrackLinesTooltip(args);
}
},
@@ -235,7 +238,20 @@ export default {
}
},
vesselTrackLinesTooltip (args) {
const p = args.object;
console.log("track lines tooltip", p);
if (p) {
let html = `Segment ${p.num+1} / ${p.nums}<br/>\n`
html += `${p.ts0.toISOString()}<br/>\n`
html += `${p.ts1.toISOString()}<br/>\n`;
return {html, style: this.tooltipDefaultStyle};
}
},
}

View File

@@ -69,6 +69,7 @@ const allMeta = (key, value) => {
return { all: [ meta(key, value) ] };
};
//
// NOTICE These routes do not require authentication
//
@@ -103,6 +104,9 @@ app.use(mw.auth.access.user);
// Don't process the request if the data hasn't changed
app.use(mw.etag.ifNoneMatch);
// Use compression across the board
app.use(mw.compress);
// We must be authenticated before we can access these
app.map({
'/project': {
@@ -117,10 +121,12 @@ app.map({
get: [ mw.auth.access.read, mw.project.summary.get ],
},
'/project/:project/configuration': {
get: [ mw.project.configuration.get ], // Get project configuration
patch: [ mw.auth.access.edit, mw.project.configuration.patch ], // Modify project configuration
put: [ mw.auth.access.edit, mw.project.configuration.put ], // Overwrite configuration
},
'/project/:project/configuration/:path(*)?': {
get: [ mw.auth.access.read, mw.configuration.get ],
},
/*
* GIS endpoints
@@ -219,16 +225,28 @@ app.map({
'changes/:since': {
get: [ mw.auth.access.read, mw.event.changes ]
},
// TODO Rename -/:sequence → sequence/:sequence
// NOTE: old alias for /sequence/:sequence
'-/:sequence/': { // NOTE: We need to avoid conflict with the next endpoint ☹
get: [ mw.auth.access.read, mw.event.sequence.get ],
},
':id/': {
'sequence/:sequence/': {
get: [ mw.auth.access.read, mw.event.sequence.get ],
},
':id(\\d+)/': {
get: [ mw.auth.access.read, mw.event.get ],
put: [ mw.auth.access.write, mw.event.put ],
patch: [ mw.auth.access.write, mw.event.patch ],
delete: [mw.auth.access.write, mw.event.delete ]
},
'import': {
put: [ mw.auth.access.write, mw.event.import.csv, mw.event.import.put ],
post: [ mw.auth.access.write, mw.event.import.csv, mw.event.import.put ],
'/:filename': {
put: [ mw.auth.access.read, mw.event.import.csv, mw.event.import.put ],
post: [ mw.auth.access.write, mw.event.import.csv, mw.event.import.put ],
delete: [ mw.auth.access.write, mw.event.import.delete ]
},
},
},
/*
@@ -268,10 +286,6 @@ app.map({
'/project/:project/label/': {
get: [ mw.auth.access.read, mw.label.list ],
// post: [ mw.label.post ],
},
'/project/:project/configuration/:path(*)?': {
get: [ mw.auth.access.read, mw.configuration.get ],
// post: [ mw.auth.access.admin, mw.label.post ],
},
'/project/:project/info/:path(*)': {
get: [ mw.auth.operations, mw.auth.access.read, mw.info.get ],
@@ -311,6 +325,30 @@ app.map({
get: [ mw.etag.noSave, mw.gis.navdata.get ]
}
},
'/vessel/track': {
get: [ /*mw.etag.noSave,*/ mw.vessel.track.get ], // JSON array
'/line': {
get: [ // GeoJSON Feature: type = LineString
//mw.etag.noSave,
(req, res, next) => { req.query.geojson = 'LineString'; next(); },
mw.vessel.track.get
]
},
'/point': {
get: [ // GeoJSON FeatureCollection: feature types = Point
//mw.etag.noSave,
(req, res, next) => { req.query.geojson = 'Point'; next(); },
mw.vessel.track.get
]
},
'/points': {
get: [ // JSON array of (Feature: type = Point)
mw.etag.noSave,
(req, res, next) => { req.query.geojson = true; next(); },
mw.vessel.track.get
],
},
},
'/info/': {
':path(*)': {
get: [ mw.auth.operations, mw.info.get ],

View File

@@ -1,6 +1,7 @@
const { projectOrganisations, vesselOrganisations/*, orgAccess */} = require('../../../lib/db/project/organisations');
const ServerUser = require('../../../lib/db/user/User');
const { Organisations } = require('@dougal/organisations');
const { ERROR, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
/** Second-order function.
* Returns a middleware that checks if the user has access to
@@ -14,11 +15,7 @@ function operation (operation) {
if (req.params.project) {
const projectOrgs = new Organisations(await projectOrganisations(req.params.project));
const availableOrgs = projectOrgs.accessToOperation(operation).filter(user.organisations);
console.log("Operation: ", operation);
console.log("User: ", user.name);
console.log("User orgs: ", user.organisations);
console.log("Project orgs: ", projectOrgs);
console.log("Available orgs: ", availableOrgs);
DEBUG(`operation = ${operation}, user = ${user?.name}, user orgs = %j, project orgs = %j, availableOrgs = %j`, user.organisations.toJSON(), projectOrgs.toJSON(), availableOrgs.toJSON());
if (availableOrgs.length > 0) {
next();
return;
@@ -26,16 +23,13 @@ function operation (operation) {
} else {
const vesselOrgs = new Organisations(await vesselOrganisations());
const availableOrgs = vesselOrgs.accessToOperation(operation).filter(user.organisations);
console.log("Operation: ", operation);
console.log("User: ", user.name);
console.log("User orgs: ", user.organisations);
console.log("Vessel orgs: ", vesselOrgs);
console.log("Available orgs: ", availableOrgs);
DEBUG(`operation = ${operation}, user = ${user?.name}, user orgs = %j, vessel orgs = %j, availableOrgs = %j`, user.organisations.toJSON(), vesselOrgs.toJSON(), availableOrgs.toJSON());
if (availableOrgs.length > 0) {
next();
return;
}
}
DEBUG(`Access denied to operation ${operation}.`);
next({status: 403, message: "Access denied"});
}
}

View File

@@ -1,41 +1,123 @@
const dns = require('dns');
const { Netmask } = require('netmask');
const ipaddr = require('ipaddr.js');
const { isIPv6, isIPv4 } = require('net');
const cfg = require('../../../lib/config');
const jwt = require('../../../lib/jwt');
const user = require('../../../lib/db/user');
const ServerUser = require('../../../lib/db/user/User');
const { ERROR, WARNING, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
async function authorisedIP (req, res) {
const validIPs = await user.ip({active: true}); // Get all active IP logins
validIPs.forEach( i => i.$block = new Netmask(i.ip) );
validIPs.sort( (a, b) => b.$block.bitmask - a.$block.netmask ); // More specific IPs have precedence
for (const ip of validIPs) {
const block = ip.$block;
if (block.contains(req.ip)) {
const payload = {
...ip,
ip: req.ip,
autologin: true
};
delete payload.$block;
delete payload.hash;
delete payload.active;
jwt.issue(payload, req, res);
return true;
function parseIP(ip) {
if (!ip || typeof ip !== 'string') {
WARNING('Invalid IP input:', ip);
return null;
}
// Handle comma-separated X-Forwarded-For (e.g., "87.90.254.127,")
const cleanIp = ip.split(',')[0].trim();
if (!cleanIp) {
WARNING('Empty IP after parsing:', ip);
return null;
}
// Convert IPv6-mapped IPv4 (e.g., ::ffff:127.0.0.1 -> 127.0.0.1)
if (cleanIp.startsWith('::ffff:') && isIPv4(cleanIp.split('::ffff:')[1])) {
return cleanIp.split('::ffff:')[1];
}
return cleanIp;
}
function normalizeCIDR(range) {
if (!range || typeof range !== 'string') {
WARNING('Invalid CIDR range:', range);
return null;
}
// If no /prefix, assume /32 for IPv4 or /128 for IPv6
if (!range.includes('/')) {
try {
const parsed = ipaddr.parse(range);
const prefix = parsed.kind() === 'ipv4' ? 32 : 128;
return `${range}/${prefix}`;
} catch (err) {
WARNING(`Failed to parse bare IP ${range}:`, err.message);
return null;
}
}
return range;
}
async function authorisedIP(req, res) {
const ip = parseIP(req.ip || req.headers['x-forwarded-for'] || req.headers['x-real-ip']);
DEBUG('authorisedIP:', { ip, headers: req.headers }); // Debug
if (!ip) {
WARNING('No valid IP provided:', { ip, headers: req.headers });
return false;
}
let addr;
try {
addr = ipaddr.parse(ip);
} catch (err) {
WARNING('Invalid IP:', ip, err.message);
return false;
}
const validIPs = await user.ip({ active: true }); // Get active IP logins
// Attach parsed CIDR to each IP entry
validIPs.forEach(i => {
const normalized = normalizeCIDR(i.ip);
if (!normalized) {
i.$range = null;
return;
}
try {
const [rangeAddr, prefix] = ipaddr.parseCIDR(normalized);
i.$range = { addr: rangeAddr, prefix };
} catch (err) {
WARNING(`Invalid CIDR range ${i.ip}:`, err.message);
i.$range = null; // Skip invalid ranges
}
});
// Filter out invalid ranges and sort by specificity (descending prefix length)
const validRanges = validIPs.filter(i => i.$range).sort((a, b) => b.$range.prefix - a.$range.prefix);
for (const ipEntry of validRanges) {
const { addr: rangeAddr, prefix } = ipEntry.$range;
try {
if (addr.match(rangeAddr, prefix)) {
const payload = {
...ipEntry,
ip,
autologin: true
};
delete payload.$range;
delete payload.hash;
delete payload.active;
jwt.issue(payload, req, res);
return true;
}
} catch (err) {
WARNING(`Error checking range ${ipEntry.ip}:`, err.message);
continue;
}
}
return false;
}
async function authorisedHost (req, res) {
const validHosts = await user.host({active: true}); // Get all active host logins
async function authorisedHost(req, res) {
const ip = parseIP(req.ip || req.headers['x-forwarded-for'] || req.headers['x-real-ip']);
DEBUG('authorisedHost:', { ip, headers: req.headers }); // Debug
if (!ip) {
WARNING('No valid IP for host check:', { ip, headers: req.headers });
return false;
}
const validHosts = await user.host({ active: true });
for (const key in validHosts) {
try {
const ip = await dns.promises.resolve(key);
if (ip == req.ip) {
const resolvedIPs = await dns.promises.resolve(key);
if (resolvedIPs.includes(ip)) {
const payload = {
...validHosts[key],
ip: req.ip,
ip,
autologin: true
};
delete payload.$block;
@@ -45,49 +127,28 @@ async function authorisedHost (req, res) {
return true;
}
} catch (err) {
if (err.code != "ENODATA") {
console.error(err);
if (err.code !== 'ENODATA') {
ERROR(`DNS error for host ${key}:`, err);
}
}
}
return false;
}
// TODO: Check client TLS certificates
// Probably will do this via Nginx with
// ssl_verify_client optional;
// and then putting either of the
// $ssl_client_s_dn or $ssl_client_escaped_cert
// variables into an HTTP header for Node
// to check (naturally, it must be ensured
// that a user cannot just insert the header
// in a request).
async function auth (req, res, next) {
async function auth(req, res, next) {
if (res.headersSent) {
// Nothing to do, this request must have been
// handled already by another middleware.
return;
return; // Handled by another middleware
}
// Check for a valid JWT (already decoded by a previous
// middleware).
// Check for valid JWT
if (req.user) {
if (!req.user.autologin) {
// If this is not an automatic login, check if the token is in the
// second half of its lifetime. If so, reissue a new one, valid for
// another cfg.jwt.options.expiresIn seconds.
if (req.user.exp) {
const ttl = req.user.exp - Date.now()/1000;
if (ttl < cfg.jwt.options.expiresIn/2) {
const credentials = await ServerUser.fromSQL(null, req.user.id);
if (credentials) {
// Refresh token
payload = Object.assign({}, credentials.toJSON());
jwt.issue(Object.assign({}, credentials.toJSON()), req, res);
}
if (!req.user.autologin && req.user.exp) {
const ttl = req.user.exp - Date.now() / 1000;
if (ttl < cfg.jwt.options.expiresIn / 2) {
const credentials = await ServerUser.fromSQL(null, req.user.id);
if (credentials) {
const payload = Object.assign({}, credentials.toJSON());
jwt.issue(payload, req, res);
}
}
}
@@ -95,19 +156,27 @@ async function auth (req, res, next) {
return;
}
// Check if the IP is known to us
// Check IP and host
if (await authorisedIP(req, res)) {
next();
return;
}
// Check if the hostname is known to us
if (await authorisedHost(req, res)) {
next();
return;
}
next({status: 401, message: "Not authorised"});
// If *all* else fails, check if the user came with a cookie
// (see https://gitlab.com/wgp/dougal/software/-/issues/335)
if (req.cookies.JWT) {
const token = req.cookies.JWT;
delete req.cookies.JWT;
DEBUG("falling back to cookie-based authentication");
req.user = await jwt.checkValidCredentials({jwt: token});
return await auth(req, res, next);
}
next({ status: 401, message: 'Not authorised' });
}
module.exports = auth;
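The client-certificate TODO above suggests an Nginx-side approach; a minimal
illustration of what that might look like (upstream and header names
hypothetical; the header must be set unconditionally here, precisely so that
a client cannot inject it):

  ssl_verify_client optional;
  location /api/ {
      proxy_set_header X-SSL-Client-DN $ssl_client_s_dn;
      proxy_pass http://dougal_backend;
  }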

View File

@@ -5,8 +5,6 @@ const cfg = require("../../../lib/config").jwt;
const getToken = function (req) {
if (req.headers.authorization && req.headers.authorization.split(' ')[0] == 'Bearer') {
return req.headers.authorization.split(' ')[1];
} else if (req.cookies.JWT) {
return req.cookies.JWT;
}
return null;
}

View File

@@ -0,0 +1,18 @@
const compression = require('compression');
const compress = compression({
level: 6, // Balance speed vs. ratio (1-9)
threshold: 512, // Compress only if response >512 bytes to avoid overhead on small bundles
filter: (req, res) => { // Ensure bundles are compressed
const accept = req.get("Accept") ?? ""; // Guard against requests without an Accept header
if (accept.startsWith("application/vnd.aaltronav.dougal+octet-stream")) return true;
if (accept.includes("json")) return true;
if (accept.startsWith("text/")) return true;
if (accept.startsWith("model/obj")) return true;
// fallback to standard filter function
return compression.filter(req, res)
}
});
module.exports = compress;

View File

@@ -23,9 +23,9 @@ function ifNoneMatch (req, res, next) {
if (cached) {
DEBUG("ETag match. Returning cached response (ETag: %s, If-None-Match: %s) for %s %s",
cached.etag, req.get("If-None-Match"), req.method, req.url);
setHeaders(res, cached.headers);
if (req.method == "GET" || req.method == "HEAD") {
res.status(304).send();
setHeaders(res, cached.headers);
res.status(304).end();
// No next()
} else if (!isIdempotentMethod(req.method)) {
res.status(412).send();

View File

@@ -66,8 +66,18 @@ const rels = [
function invalidateCache (data, cache) {
return new Promise((resolve, reject) => {
if (!data) {
ERROR("invalidateCache called with no data");
return;
}
if (!data.payload) {
ERROR("invalidateCache called without a payload; channel = %s", data.channel);
return;
}
const channel = data.channel;
const project = data.payload.pid ?? data.payload?.new?.pid ?? data.payload?.old?.pid;
const project = data.payload?.pid ?? data.payload?.new?.pid ?? data.payload?.old?.pid;
const operation = data.payload.operation;
const table = data.payload.table;
const fields = { channel, project, operation, table };

View File

@@ -0,0 +1,146 @@
const Busboy = require('busboy');
const { parse } = require('csv-parse/sync');
async function middleware(req, res, next) {
const contentType = req.headers['content-type'] || '';
let csvText = null;
let filename = null;
if (req.params.filename && contentType.startsWith('text/csv')) {
csvText = typeof req.body === 'string' ? req.body : req.body.toString('utf8');
filename = req.params.filename;
processCsv();
} else if (contentType.startsWith('multipart/form-data')) {
const busboy = Busboy({ headers: req.headers });
let found = false;
busboy.on('file', (name, file, info) => {
if (found) {
file.resume();
return;
}
if (info.mimeType === 'text/csv') {
found = true;
filename = info.filename || 'unnamed.csv';
csvText = '';
file.setEncoding('utf8');
file.on('data', (data) => { csvText += data; });
file.on('end', () => {});
} else {
file.resume();
}
});
busboy.on('field', () => {}); // Ignore fields
busboy.on('finish', () => {
if (!found) {
return next();
}
processCsv();
});
req.pipe(busboy);
return;
} else {
return next();
}
function processCsv() {
let records;
try {
records = parse(csvText, {
relax_quotes: true,
quote: '"',
escape: '"',
skip_empty_lines: true,
trim: true
});
} catch (e) {
return res.status(400).json({ error: 'Invalid CSV' });
}
if (!records.length) {
return res.status(400).json({ error: 'Empty CSV' });
}
const headers = records[0].map(h => h.toLowerCase().trim());
const rows = records.slice(1);
let lastDate = null;
let lastTime = null;
const currentDate = new Date().toISOString().slice(0, 10);
const currentTime = new Date().toISOString().slice(11, 19);
const events = [];
for (let row of rows) {
let object = { labels: [] };
for (let k = 0; k < headers.length; k++) {
let key = headers[k];
let val = row[k] ? row[k].trim() : '';
if (!key) continue;
if (['remarks', 'event', 'comment', 'comments', 'text'].includes(key)) {
object.remarks = val;
} else if (key === 'label') {
if (val) object.labels.push(val);
} else if (key === 'labels') {
if (val) object.labels.push(...val.split(';').map(l => l.trim()).filter(l => l));
} else if (key === 'sequence' || key === 'seq') {
if (val) object.sequence = Number(val);
} else if (['point', 'shot', 'shotpoint'].includes(key)) {
if (val) object.point = Number(val);
} else if (key === 'date') {
object.date = val;
} else if (key === 'time') {
object.time = val;
} else if (key === 'timestamp') {
object.timestamp = val;
} else if (key === 'latitude') {
object.latitude = parseFloat(val);
} else if (key === 'longitude') {
object.longitude = parseFloat(val);
}
}
if (!object.remarks) continue;
let useSeqPoint = Number.isFinite(object.sequence) && Number.isFinite(object.point);
let tstamp = null;
if (!useSeqPoint) {
if (object.timestamp) {
tstamp = new Date(object.timestamp);
}
if (!tstamp || isNaN(tstamp.getTime())) {
let dateStr = object.date || lastDate || currentDate;
let timeStr = object.time || lastTime || currentTime;
if (timeStr.length === 5) timeStr += ':00';
let full = `${dateStr}T${timeStr}.000Z`;
tstamp = new Date(full);
if (isNaN(tstamp.getTime())) continue;
}
if (object.date) lastDate = object.date;
if (object.time) lastTime = object.time;
}
let event = {
remarks: object.remarks,
labels: object.labels,
meta: {
author: "*CSVImport*",
"*CSVImport*": {
filename,
tstamp: new Date().toISOString()
}
}
};
if (!isNaN(object.latitude) && !isNaN(object.longitude)) {
event.meta.geometry = {
type: "Point",
coordinates: [object.longitude, object.latitude]
};
}
if (useSeqPoint) {
event.sequence = object.sequence;
event.point = object.point;
} else if (tstamp) {
event.tstamp = tstamp.toISOString();
} else {
continue;
}
events.push(event);
}
req.body = events;
next();
}
}
module.exports = middleware;

View File

@@ -0,0 +1,18 @@
const { event } = require('../../../../lib/db');
module.exports = async function (req, res, next) {
try {
if (req.params.project && req.params.filename) {
await event.unimport(req.params.project, req.params.filename, req.query);
res.status(204).end();
} else {
res.status(400).send({message: "Malformed request"});
}
next();
} catch (err) {
next(err);
}
};

View File

@@ -0,0 +1,6 @@
module.exports = {
csv: require('./csv'),
put: require('./put'),
delete: require('./delete'),
}

View File

@@ -0,0 +1,16 @@
const { event } = require('../../../../lib/db');
module.exports = async function (req, res, next) {
try {
const payload = req.body;
await event.import(req.params.project, payload, req.query);
res.status(200).send(payload);
next();
} catch (err) {
next(err);
}
};

View File

@@ -7,5 +7,6 @@ module.exports = {
put: require('./put'),
patch: require('./patch'),
delete: require('./delete'),
changes: require('./changes')
changes: require('./changes'),
import: require('./import'),
}

View File

@@ -11,6 +11,7 @@ module.exports = {
gis: require('./gis'),
label: require('./label'),
navdata: require('./navdata'),
vessel: require('./vessel'),
queue: require('./queue'),
qc: require('./qc'),
configuration: require('./configuration'),
@@ -20,5 +21,6 @@ module.exports = {
rss: require('./rss'),
etag: require('./etag'),
version: require('./version'),
admin: require('./admin')
admin: require('./admin'),
compress: require('./compress'),
};

View File

@@ -10,7 +10,6 @@ function json (req, res, next) {
} else {
res.status(404).send({message: "Not found"});
}
next();
}
function yaml (req, res, next) {
@@ -19,7 +18,6 @@ function yaml (req, res, next) {
} else {
res.status(404).send({message: "Not found"});
}
next();
}
function csv (req, res, next) {
@@ -33,7 +31,6 @@ function csv (req, res, next) {
} else {
res.status(404).send({message: "Not found"});
}
next();
}
module.exports = async function (req, res, next) {
@@ -53,9 +50,10 @@ module.exports = async function (req, res, next) {
await handlers[mimetype](req, res, next);
} else {
res.status(406).send();
next();
}
next();
} catch (err) {
console.error(err);
next(err);
}
}

View File

@@ -8,7 +8,6 @@ async function login (req, res, next) {
if (payload) {
const token = jwt.issue(payload, req, res);
res.set("X-JWT", token);
res.set("Set-Cookie", `JWT=${token}`); // For good measure
res.status(200).send({token});
next();
return;

View File

@@ -1,6 +1,5 @@
async function logout (req, res, next) {
res.clearCookie("JWT");
res.status(204).send();
next();
}

View File

@@ -0,0 +1,4 @@
module.exports = {
track: require('./track'),
}

View File

@@ -0,0 +1,10 @@
const { gis } = require('../../../../lib/db');
module.exports = async function (req, res, next) {
try {
res.status(200).send(await gis.vesseltrack.get(req.query));
next();
} catch (err) {
next(err);
}
}

View File

@@ -0,0 +1,4 @@
module.exports = {
get: require('./get'),
}

View File

@@ -1,4 +1,3 @@
const project = require('../../lib/db/project');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
class DetectProjectConfigurationChange {
@@ -10,7 +9,7 @@ class DetectProjectConfigurationChange {
// Grab project configurations.
// NOTE that this will run asynchronously
this.run({channel: "project"}, ctx);
//this.run({channel: "project"}, ctx);
}
async run (data, ctx) {
@@ -28,13 +27,13 @@ class DetectProjectConfigurationChange {
try {
DEBUG("Project configuration change detected")
const projects = await project.get();
project.organisations.setCache(projects);
const projects = await ctx.db.project.get();
ctx.db.project.organisations.setCache(projects);
const _ctx_data = {};
for (let pid of projects.map(i => i.pid)) {
DEBUG("Retrieving configuration for", pid);
const cfg = await project.configuration.get(pid);
const cfg = await ctx.db.project.configuration.get(pid);
if (cfg?.archived === true) {
DEBUG(pid, "is archived. Ignoring");
continue;

View File

@@ -1,5 +1,3 @@
const { schema2pid } = require('../../lib/db/connection');
const { event } = require('../../lib/db');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
class DetectSoftStart {
@@ -33,14 +31,19 @@ class DetectSoftStart {
const prev = this.prev?.payload?.new?.meta;
// DEBUG("%j", prev);
// DEBUG("%j", cur);
DEBUG("cur.num_guns: %d\ncur.num_active: %d\nprv.num_active: %d\ntest passed: %j", cur.num_guns, cur.num_active, prev.num_active, cur.num_active >= 1 && !prev.num_active && cur.num_active < cur.num_guns);
if (cur.lineStatus == "online" || prev.lineStatus == "online") {
DEBUG("lineStatus is online, assuming not in a soft start situation");
return;
}
DEBUG("cur.num_guns: %d\ncur.num_active: %d\nprv.num_active: %d\ncur.num_nofire: %d\nprev.num_nofire: %d", cur.num_guns, cur.num_active, prev.num_active, cur.num_nofire, prev.num_nofire);
if (cur.num_active >= 1 && !prev.num_active && cur.num_active < cur.num_guns) {
INFO("Soft start detected @", cur.tstamp);
// FIXME Shouldn't need to use schema2pid as pid already present in payload.
const projectId = await schema2pid(cur._schema ?? prev._schema);
const projectId = await ctx.schema2pid(cur._schema ?? prev._schema);
// TODO: Try and grab the corresponding comment from the configuration?
const payload = {
@@ -50,12 +53,16 @@ class DetectSoftStart {
meta: {auto: true, author: `*${this.constructor.name}*`}
};
DEBUG("Posting event", projectId, payload);
await event.post(projectId, payload);
if (ctx.dryRun) {
DEBUG(`DRY RUN: await ctx.db.event.post(${projectId}, ${payload});`);
} else {
await ctx.db.event.post(projectId, payload);
}
} else if (cur.num_active == cur.num_guns && prev.num_active < cur.num_active) {
} else if ((cur.num_active == cur.num_guns || (prev.num_nofire > 0 && cur.num_nofire == 0)) && prev.num_active < cur.num_active) {
INFO("Full volume detected @", cur.tstamp);
const projectId = await schema2pid(cur._schema ?? prev._schema);
const projectId = await ctx.schema2pid(cur._schema ?? prev._schema);
// TODO: Try and grab the corresponding comment from the configuration?
const payload = {
@@ -65,7 +72,11 @@ class DetectSoftStart {
meta: {auto: true, author: `*${this.constructor.name}*`}
};
DEBUG("Posting event", projectId, payload);
await event.post(projectId, payload);
if (ctx.dryRun) {
DEBUG(`DRY RUN: await ctx.db.event.post(${projectId}, ${payload});`);
} else {
await ctx.db.event.post(projectId, payload);
}
}
} catch (err) {

View File

@@ -1,5 +1,3 @@
const { schema2pid } = require('../../lib/db/connection');
const { event } = require('../../lib/db');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
class DetectSOLEOL {
@@ -43,7 +41,7 @@ class DetectSOLEOL {
// We must use schema2pid because the pid may not have been
// populated for this event.
const projectId = await schema2pid(cur._schema ?? prev._schema);
const projectId = await ctx.schema2pid(cur._schema ?? prev._schema);
const labels = ["FSP", "FGSP"];
const remarks = `SEQ ${cur._sequence}, SOL ${cur.lineName}, BSP: ${(cur.speed*3.6/1.852).toFixed(1)} kt, Water depth: ${Number(cur.waterDepth).toFixed(0)} m.`;
const payload = {
@@ -55,24 +53,32 @@ class DetectSOLEOL {
meta: {auto: true, author: `*${this.constructor.name}*`}
}
INFO("Posting event", projectId, payload);
await event.post(projectId, payload);
if (ctx.dryRun) {
DEBUG("DRY RUN: await ctx.db.event.post(%s, %O)", projectId, payload);
} else {
await ctx.db.event.post(projectId, payload);
}
} else if (prev.lineName == cur.lineName && prev._sequence == cur._sequence &&
prev.lineStatus == "online" && cur.lineStatus != "online" && sequence) {
INFO("Transition to OFFLINE detected");
const projectId = await schema2pid(prev._schema ?? cur._schema);
const projectId = await ctx.schema2pid(prev._schema ?? cur._schema);
const labels = ["LSP", "LGSP"];
const remarks = `SEQ ${cur._sequence}, EOL ${cur.lineName}, BSP: ${(cur.speed*3.6/1.852).toFixed(1)} kt, Water depth: ${Number(cur.waterDepth).toFixed(0)} m.`;
const remarks = `SEQ ${prev._sequence}, EOL ${prev.lineName}, BSP: ${(prev.speed*3.6/1.852).toFixed(1)} kt, Water depth: ${Number(prev.waterDepth).toFixed(0)} m.`;
const payload = {
type: "sequence",
sequence,
point: cur._point,
point: prev._point,
remarks,
labels,
meta: {auto: true, author: `*${this.constructor.name}*`}
}
INFO("Posting event", projectId, payload);
await event.post(projectId, payload);
if (ctx.dryRun) {
DEBUG("DRY RUN: await ctx.db.event.post(%s, %O)", projectId, payload);
} else {
await ctx.db.event.post(projectId, payload);
}
}
} catch (err) {

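The BSP figure embedded in the remarks strings above converts speed from m/s to knots: multiply by 3.6 for km/h, divide by 1.852 (one nautical mile in km). A one-liner sketch with a hypothetical speed value:

const mps2knots = speed => speed * 3.6 / 1.852; // m/s -> km/h -> kt
console.log(mps2knots(2.4).toFixed(1)); // "4.7"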
View File

@@ -8,37 +8,6 @@ const Handlers = [
require('./detect-fdsp')
];
function init (ctx) {
const instances = Handlers.map(Handler => new Handler(ctx));
function prepare (data, ctx) {
const promises = [];
for (let instance of instances) {
const promise = new Promise(async (resolve, reject) => {
try {
DEBUG("Run", instance.author);
const result = await instance.run(data, ctx);
DEBUG("%s result: %O", instance.author, result);
resolve(result);
} catch (err) {
ERROR("%s error:\n%O", instance.author, err);
reject(err);
}
});
promises.push(promise);
}
return promises;
}
function despatch (data, ctx) {
return Promise.allSettled(prepare(data, ctx));
}
return { instances, prepare, despatch };
}
module.exports = {
Handlers,
init
};

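The prepare/despatch pair above fans each notification out to every handler and collects one settled result per handler, so a failing handler cannot starve the others. A self-contained sketch of the same pattern, with hypothetical stand-in handler classes:

class Ok { async run (data) { return `ok:${data.channel}`; } }
class Boom { async run () { throw new Error("boom"); } }
const instances = [new Ok(), new Boom()];
function despatch (data) {
  // One promise per handler; allSettled isolates individual failures.
  return Promise.allSettled(instances.map(i => i.run(data)));
}
despatch({ channel: "realtime" }).then(results => {
  console.log(results.map(r => r.status)); // [ 'fulfilled', 'rejected' ]
});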
View File

@@ -1,6 +1,3 @@
const { event, project } = require('../../lib/db');
const { withinValidity } = require('../../lib/utils/ranges');
const unique = require('../../lib/utils/unique');
const { ALERT, ERROR, WARNING, NOTICE, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
class ReportLineChangeTime {
@@ -44,7 +41,7 @@ class ReportLineChangeTime {
async function getLineChangeTime (data, forward = false) {
if (forward) {
const ospEvents = await event.list(projectId, {label: "FGSP"});
const ospEvents = await ctx.db.event.list(projectId, {label: "FGSP"});
// DEBUG("ospEvents", ospEvents);
const osp = ospEvents.filter(i => i.tstamp > data.tstamp).pop();
DEBUG("fsp", osp);
@@ -55,7 +52,7 @@ class ReportLineChangeTime {
return { lineChangeTime: osp.tstamp - data.tstamp, osp };
}
} else {
const ospEvents = await event.list(projectId, {label: "LGSP"});
const ospEvents = await ctx.db.event.list(projectId, {label: "LGSP"});
// DEBUG("ospEvents", ospEvents);
const osp = ospEvents.filter(i => i.tstamp < data.tstamp).shift();
DEBUG("lsp", osp);
@@ -96,16 +93,20 @@ class ReportLineChangeTime {
const opts = {jpq};
if (Array.isArray(seq)) {
opts.sequences = unique(seq).filter(i => !!i);
opts.sequences = ctx.unique(seq).filter(i => !!i);
} else {
opts.sequence = seq;
}
const staleEvents = await event.list(projectId, opts);
const staleEvents = await ctx.db.event.list(projectId, opts);
DEBUG(staleEvents.length ?? 0, "events to delete");
for (let staleEvent of staleEvents) {
DEBUG(`Deleting event id ${staleEvent.id} (seq = ${staleEvent.sequence}, point = ${staleEvent.point})`);
await event.del(projectId, staleEvent.id);
if (ctx.dryRun) {
DEBUG(`DRY RUN: await ctx.db.event.del(${projectId}, ${staleEvent.id});`);
} else {
await ctx.db.event.del(projectId, staleEvent.id);
}
}
}
}
@@ -180,7 +181,11 @@ class ReportLineChangeTime {
const maybePostEvent = async (projectId, payload) => {
DEBUG("Posting event", projectId, payload);
await event.post(projectId, payload);
if (ctx.dryRun) {
DEBUG("DRY RUN: await ctx.db.event.post(%s, %O)", projectId, payload);
} else {
await ctx.db.event.post(projectId, payload);
}
}
@@ -192,7 +197,7 @@ class ReportLineChangeTime {
const data = n;
DEBUG("INSERT seen: will add lct events related to ", data.id);
if (withinValidity(data.validity)) {
if (ctx.withinValidity(data.validity)) {
DEBUG("Event within validity period", data.validity, new Date());
data.tstamp = new Date(data.tstamp);

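The line change time computed above is simply the interval between an end-of-line timestamp and the next FGSP event (or the previous LGSP when looking backwards). A sketch with hypothetical timestamps:

const eol  = { tstamp: new Date("2025-08-16T10:00:00Z") }; // hypothetical EOL
const fgsp = { tstamp: new Date("2025-08-16T12:30:00Z") }; // next FGSP event
const lineChangeTime = fgsp.tstamp - eol.tstamp;           // milliseconds
console.log(lineChangeTime / 3600e3, "hours");             // 2.5 hours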
View File

@@ -1,29 +1,101 @@
const nodeAsync = require('async'); // npm install async
const { listen } = require('../lib/db/notify');
const db = require('../lib/db'); // Adjust paths; include all needed DB utils
const { schema2pid } = require('../lib/db/connection');
const unique = require('../lib/utils/unique'); // If needed by handlers
const withinValidity = require('../lib/utils/ranges').withinValidity; // If needed
const { ALERT, ERROR, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
// List of handler classes (add more as needed)
const handlerClasses = require('./handlers').Handlers;
// Channels to listen to (hardcoded for simplicity; could scan handlers for mentions)
const channels = require('../lib/db/channels');
const handlers = require('./handlers');
const { ActionsQueue } = require('../lib/queue');
const { ALERT, ERROR, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
function start () {
// Queue config: Process one at a time for order; max retries=3
const eventQueue = nodeAsync.queue(async (task, callback) => {
const { data, ctx } = task;
DEBUG(`Processing event on channel ${data.channel} with timestamp ${data._received ?? 'unknown'}`);
const queue = new ActionsQueue();
const ctx = {}; // Context object
for (const handler of ctx.handlers) {
try {
await handler.run(data, ctx);
} catch (err) {
ERROR(`Error in handler ${handler.constructor.name}:`, err);
// Retry logic: Could add task.retries++, re-enqueue if < max
}
}
const { prepare, despatch } = handlers.init(ctx);
if (typeof callback === 'function') {
// async v3.2.6+ does not use callbacks with AsyncFunctions, but anyway
callback();
}
}, 1); // Concurrency=1 for strict order
listen(channels, function (data) {
DEBUG("Incoming data", data);
eventQueue.error((err, task) => {
ALERT(`Queue error processing task:`, err, task);
});
// We don't bother awaiting
queue.enqueue(() => despatch(data, ctx));
DEBUG("Queue size", queue.length());
// Main setup function (call from server init)
async function setupEventHandlers(projectsConfig) {
// Shared context
const ctx = {
dryRun: Boolean(process.env.DOUGAL_HANDLERS_DRY_RUN), // If true, don't commit changes
projects: { configuration: projectsConfig }, // From user config
handlers: handlerClasses.map(Cls => new Cls()), // Instances
// DB utils (add more as needed)
db,
schema2pid,
unique,
withinValidity
// Add other utils, e.g., ctx.logger = DEBUG;
};
// Optional: Replay recent events on startup to rebuild state
// await replayRecentEvents(ctx);
// Setup listener
const subscriber = await listen(channels, (rawData) => {
const data = {
...rawData,
enqueuedAt: new Date() // For monitoring
};
eventQueue.push({ data, ctx });
});
INFO("Events manager started");
DEBUG('Event handler system initialized with channels:', channels);
if (ctx.dryRun) {
DEBUG('DRY RUNNING');
}
// Return for cleanup if needed
return {
close: () => {
subscriber.events.removeAllListeners();
subscriber.close();
eventQueue.kill();
}
};
}
module.exports = { start }
// Optional: Replay last N events to rebuild handler state (e.g., this.prev)
// async function replayRecentEvents(ctx) {
// try {
// // Example: Fetch last 10 realtime events, sorted by tstamp
// const recentRealtime = await event.listAllProjects({ channel: 'realtime', limit: 10, sort: 'tstamp DESC' });
// // Assume event.listAllProjects is a custom DB method; implement if needed
//
// // Enqueue in original order (reverse sort)
// recentRealtime.reverse().forEach((evt) => {
// const data = { channel: 'realtime', payload: { new: evt } };
// eventQueue.push({ data, ctx });
// });
//
// // Similarly for 'event' channel if needed
// DEBUG('Replayed recent events for state rebuild');
// } catch (err) {
// ERROR('Error replaying events:', err);
// }
// }
if (require.main === module) {
start();
}
module.exports = { setupEventHandlers };

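The rewrite above replaces the in-house ActionsQueue with the async package's queue: concurrency 1 preserves arrival order, and queue.error() catches whatever a task throws. A minimal sketch of that pattern:

const nodeAsync = require('async'); // npm install async
const queue = nodeAsync.queue(async (task) => {
  // async v3 resolves when an async worker returns; no callback needed.
  console.log("processing", task.id);
}, 1); // concurrency 1: strict arrival order
queue.error((err, task) => console.error("task failed", task, err));
queue.push({ id: 1 });
queue.push({ id: 2 }); // runs only after task 1 settles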
View File

@@ -2,18 +2,37 @@
const { ERROR, INFO, DEBUG } = require('DOUGAL_ROOT/debug')(__filename);
async function getProjectConfigurations (opts = {}) {
const { includeArchived } = { includeArchived: false, ...opts };
let projectConfigurations = {};
try {
const db = require('./lib/db');
const pids = (await db.project.get())
.filter(i => includeArchived || !i.archived)
.map(i => i.pid);
for (const pid of pids) {
DEBUG(`Reading project configuration for ${pid}`);
const cfg = await db.project.configuration.get(pid);
projectConfigurations[pid] = cfg;
}
} catch (err) {
ERROR("Failed to get project configurations");
ERROR(err);
}
return projectConfigurations;
}
async function main () {
// Check that we're running against the correct database version
const version = require('./lib/version');
INFO("Running version", await version.describe());
version.compatible()
.then( (versions) => {
.then( async (versions) => {
try {
const api = require('./api');
const ws = require('./ws');
const periodicTasks = require('./periodic-tasks').init();
const { fork } = require('child_process');
const { setupEventHandlers } = require('./events');
const port = process.env.HTTP_PORT || 3000;
const host = process.env.HTTP_HOST || "127.0.0.1";
@@ -25,33 +44,31 @@ async function main () {
periodicTasks.start();
const eventManagerPath = [__dirname, "events"].join("/");
const eventManager = fork(eventManagerPath, /*{ stdio: 'ignore' }*/);
const projectConfigurations = await getProjectConfigurations();
const handlerSystem = await setupEventHandlers(projectConfigurations);
process.on("SIGINT", async () => {
DEBUG("Interrupted (SIGINT)");
eventManager.kill()
handlerSystem.close();
await periodicTasks.cleanup();
process.exit(0);
})
process.on("SIGHUP", async () => {
DEBUG("Stopping (SIGHUP)");
eventManager.kill()
handlerSystem.close();
await periodicTasks.cleanup();
process.exit(0);
})
process.on('beforeExit', async () => {
DEBUG("Preparing to exit");
eventManager.kill()
handlerSystem.close();
await periodicTasks.cleanup();
});
process.on('exit', async () => {
DEBUG("Exiting");
// eventManager.kill()
// periodicTasks.cleanup();
});
} catch (err) {
ERROR(err);

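getProjectConfigurations() above returns a plain object keyed by pid, with archived projects filtered out by default; downstream handlers still re-check the archived flag themselves. A sketch of the shape, with hypothetical pids and settings:

const projectConfigurations = {
  "42": { archived: false /* ...project settings... */ },
  "43": { archived: true }
};
for (const [pid, cfg] of Object.entries(projectConfigurations)) {
  if (cfg?.archived === true) continue; // same test as the handler loop
  console.log("active project", pid);
}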
View File

@@ -0,0 +1,105 @@
const { DEBUG, ERROR } = require('DOUGAL_ROOT/debug')(__filename);
const { setSurvey, transaction } = require('../connection');
/** Remove a previous import from the database.
*
* ATTENTION!
*
* This will not just mark the events as deleted but actually
* remove them.
*/
async function bulk_unimport (projectId, filename, opts = {}) {
const client = opts.client ?? await setSurvey(projectId);
try {
const text = `
DELETE
FROM event_log
WHERE meta ? 'author'
AND meta->(meta->>'author')->>'filename' = $1;
`;
const values = [ filename ];
DEBUG("Removing all event data imported from filename '%s'", filename);
await client.query(text, values);
} catch (err) {
err.origin = __filename;
throw err;
} finally {
if (client !== opts.client) client.release();
}
return;
}
async function bulk_import (projectId, payload, opts = {}) {
const client = opts.client ?? await setSurvey(projectId);
try {
if (!payload.length) {
DEBUG("Called with no rows to be imported. Returning");
return [];
}
const filename = payload[0].meta[payload[0].meta.author].filename;
// Delete previous data from this file
await transaction.begin(client);
await bulk_unimport(projectId, filename, {client});
// Prepare arrays for each column
const tstamps = [];
const sequences = [];
const points = [];
const remarks = [];
const labels = [];
const metas = [];
for (const event of payload) {
tstamps.push(event.tstamp ? new Date(event.tstamp) : null);
sequences.push(Number.isInteger(event.sequence) ? event.sequence : null);
points.push(Number.isInteger(event.point) ? event.point : null);
remarks.push(event.remarks || '');
labels.push(Array.isArray(event.labels) && event.labels.length
  // Postgres array literals escape embedded quotes and backslashes with a backslash
  ? `{${event.labels.map(l => `"${l.replace(/(["\\])/g, '\\$1')}"`).join(',')}}`
  : '{}'
);
metas.push(event.meta ? JSON.stringify(event.meta) : '{}');
}
const text = `
INSERT INTO event_log (tstamp, sequence, point, remarks, labels, meta)
SELECT
UNNEST($1::TIMESTAMP[]) AS tstamp,
UNNEST($2::INTEGER[]) AS sequence,
UNNEST($3::INTEGER[]) AS point,
replace_placeholders(UNNEST($4::TEXT[]), UNNEST($1::TIMESTAMP[]), UNNEST($2::INTEGER[]), UNNEST($3::INTEGER[])) AS remarks,
UNNEST($5::TEXT[])::TEXT[] AS labels,
UNNEST($6::JSONB[]) AS meta
RETURNING id;
`;
const values = [ tstamps, sequences, points, remarks, labels, metas ];
DEBUG("Importing %d rows from filename '%s'", payload.length, filename);
const res = await client.query(text, values);
await transaction.commit(client);
return res.rows.map(row => row.id);
} catch (err) {
err.origin = __filename;
throw err;
} finally {
if (client !== opts.client) client.release();
}
return;
}
module.exports = { import: bulk_import, unimport: bulk_unimport };

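The INSERT above uses the parallel-array UNNEST idiom: one bind parameter per column, each holding a JS array, zipped row-wise in a single round trip. A reduced sketch of the same pattern, against a hypothetical two-column table:

const text = `
  INSERT INTO example_log (tstamp, remarks)
  SELECT UNNEST($1::TIMESTAMP[]) AS tstamp,
         UNNEST($2::TEXT[])      AS remarks
  RETURNING id;
`;
const values = [
  [new Date("2025-08-14T10:00:00Z"), new Date("2025-08-14T10:01:00Z")],
  ["first row", "second row"]
];
// const res = await client.query(text, values); // res.rows -> [ { id }, ... ]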
View File

@@ -6,5 +6,7 @@ module.exports = {
put: require('./put'),
patch: require('./patch'),
del: require('./delete'),
changes: require('./changes')
changes: require('./changes'),
import: require('./import').import,
unimport: require('./import').unimport,
}

View File

@@ -0,0 +1,37 @@
const { DEBUG, ERROR } = require('DOUGAL_ROOT/debug')(__filename);
const { setSurvey, transaction } = require('../connection');
/** Remove a previous import from the database.
*
* ATTENTION!
*
* This will not just mark the events as deleted but actually
* remove them.
*/
async function unimport (projectId, filename, opts = {}) {
const client = await setSurvey(projectId);
try {
const text = `
DELETE
FROM event_log
WHERE meta ? 'author'
AND meta->(meta->>'author')->>'filename' = $1;
`;
const values = [ filename ];
DEBUG("Removing all event data imported from filename '%s'", filename);
await client.query(text, values);
} catch (err) {
err.origin = __filename;
throw err;
} finally {
client.release();
}
return;
}
module.exports = unimport;

View File

@@ -1,6 +1,7 @@
module.exports = {
project: require('./project'),
navdata: require('./navdata')
navdata: require('./navdata'),
vesseltrack: require('./vesseltrack'),
// line: require('./line')
};

View File

@@ -7,7 +7,10 @@ async function lines (options = {}) {
const client = await pool.connect();
const text = `
SELECT ST_AsGeoJSON(ST_MakeLine(geometry)) geojson
SELECT json_build_object(
'type', 'Feature',
'geometry', ST_AsGeoJSON(ST_MakeLine(geometry))::json
) geojson
FROM (
SELECT geometry
FROM real_time_inputs

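The json_build_object wrapper above returns a ready-made GeoJSON Feature rather than a bare geometry string. A hypothetical shape of the resulting geojson column (coordinates illustrative; a properties member can be merged in downstream if consumers require one):

const feature = {
  type: "Feature",
  geometry: { type: "LineString", coordinates: [[2.31, 51.05], [2.32, 51.06]] }
};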
View File

@@ -0,0 +1,191 @@
const { pool } = require('../../connection');
const { project } = require('../../utils');
async function get(options = {}) {
/*
* ts0: earliest timestamp (default: NOW - 7 days)
* ts1: latest timestamp (if null, assume NOW)
* di: decimation interval (return every di-th record, if null: no decimation)
* l: limit (return no more than l records, default: 1000, max: 1,000,000)
* geojson: 'LineString' (GeoJSON LineString Feature), 'Point' (GeoJSON FeatureCollection),
* truthy (array of Point features), falsy (array of {x, y, tstamp, meta})
*/
let { l, di, ts1, ts0, geojson, projection } = {
l: 1000,
di: null,
ts1: null,
ts0: null,
geojson: false,
...options
};
// Input validation and sanitization
l = Math.max(1, Math.min(parseInt(l) || 1000, 1000000)); // Enforce 1 <= l <= 1,000,000
di = di != null ? Math.max(1, parseInt(di)) : null; // Ensure di is positive integer or null
ts0 = ts0 ? new Date(ts0).toISOString() : new Date(Date.now() - 7 * 24 * 60 * 60 * 1000).toISOString(); // Default: 7 days ago
ts1 = ts1 ? new Date(ts1).toISOString() : null; // Convert to ISO string or null
geojson = geojson === 'LineString' || geojson === 'Point' ? geojson : !!geojson; // Normalize geojson
const client = await pool.connect();
// Build the WHERE clause and values array dynamically
let whereClauses = [];
let values = [];
let paramIndex = 1;
if (ts0) {
whereClauses.push(`tstamp >= $${paramIndex++}`);
values.push(ts0);
}
if (ts1) {
whereClauses.push(`tstamp <= $${paramIndex++}`);
values.push(ts1);
}
// Add limit to values
values.push(l);
const limitClause = `LIMIT $${paramIndex++}`;
// Base query with conditional geometry selection
let queryText = `
SELECT
tstamp,
CASE
WHEN meta->>'lineStatus' = 'offline' THEN
ARRAY[COALESCE((meta->>'longitude')::float, ST_X(geometry)), COALESCE((meta->>'latitude')::float, ST_Y(geometry))]
ELSE
ARRAY[
COALESCE(
(meta->>'longitudeMaster')::float,
ST_X(geometry)
),
COALESCE(
(meta->>'latitudeMaster')::float,
ST_Y(geometry)
)
]
END AS coordinates,
meta::json AS meta
FROM public.real_time_inputs
${whereClauses.length ? 'WHERE ' + whereClauses.join(' AND ') : ''}
ORDER BY tstamp DESC
${limitClause}
`;
// If decimation is requested, wrap the query in a subquery with ROW_NUMBER
if (di != null && di > 1) {
values.push(di);
queryText = `
SELECT tstamp, coordinates, meta
FROM (
SELECT
tstamp,
CASE
WHEN meta->>'lineStatus' = 'offline' THEN
ARRAY[COALESCE((meta->>'longitude')::float, ST_X(geometry)), COALESCE((meta->>'latitude')::float, ST_Y(geometry))]
ELSE
ARRAY[
COALESCE(
(meta->>'longitudeMaster')::float,
ST_X(geometry)
),
COALESCE(
(meta->>'latitudeMaster')::float,
ST_Y(geometry)
)
]
END AS coordinates,
meta::json AS meta,
ROW_NUMBER() OVER (ORDER BY tstamp DESC) AS rn
FROM public.real_time_inputs
${whereClauses.length ? 'WHERE ' + whereClauses.join(' AND ') : ''}
) sub
WHERE rn % $${paramIndex} = 0
ORDER BY tstamp DESC
${limitClause}
`;
}
try {
const res = await client.query(queryText, values);
if (!res.rows?.length) {
throw { status: 204 }; // No Content
}
// Process rows: Convert tstamp to Date and extract coordinates
let processed = res.rows.map(row => ({
tstamp: new Date(row.tstamp),
x: row.coordinates[0], // Longitude
y: row.coordinates[1], // Latitude
meta: projection != null
  ? projection == ""
    ? undefined
    : project([row.meta], projection)[0]
  : row.meta
}));
// Handle geojson output formats
if (geojson === 'LineString') {
// Compute line length (haversine formula in JavaScript for simplicity)
let length = 0;
for (let i = 1; i < processed.length; i++) {
const p1 = processed[i - 1];
const p2 = processed[i];
const R = 6371e3; // Earth's radius in meters
const φ1 = p1.y * Math.PI / 180;
const φ2 = p2.y * Math.PI / 180;
const Δφ = (p2.y - p1.y) * Math.PI / 180;
const Δλ = (p2.x - p1.x) * Math.PI / 180;
const a = Math.sin(Δφ / 2) ** 2 +
Math.cos(φ1) * Math.cos(φ2) * Math.sin(Δλ / 2) ** 2;
const c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
length += R * c;
}
return {
type: 'Feature',
geometry: {
type: 'LineString',
coordinates: processed.map(p => [p.x, p.y])
},
properties: {
ts0: processed[processed.length - 1].tstamp.toISOString(),
ts1: processed[0].tstamp.toISOString(),
distance: length
}
};
} else if (geojson === 'Point') {
return {
type: 'FeatureCollection',
features: processed.map(p => ({
type: 'Feature',
geometry: {
type: 'Point',
coordinates: [p.x, p.y]
},
properties: {
...p.meta,
tstamp: p.tstamp.toISOString()
}
}))
};
} else if (geojson) {
return processed.map(p => ({
type: 'Feature',
geometry: {
type: 'Point',
coordinates: [p.x, p.y]
},
properties: {
...p.meta,
tstamp: p.tstamp.toISOString()
}
}));
} else {
return processed;
}
} finally {
client.release();
}
}
module.exports = get;

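A hypothetical call to the accessor above, showing how the options interact; parameter values are illustrative:

const get = require('./get'); // as wired in the vesseltrack index.js below
get({
  ts0: "2025-08-15T00:00:00Z", // window start (ts1 defaults to now)
  l: 5000,                     // hard cap on returned records
  di: 10,                      // keep every 10th record
  geojson: "LineString"        // single Feature; distance in properties
}).then(track => {
  console.log(track.properties.distance, "m"); // haversine track length
}).catch(err => console.error(err));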
View File

@@ -0,0 +1,5 @@
module.exports = {
get: require('./get')
};

View File

@@ -0,0 +1,94 @@
const { exec } = require('child_process');
const util = require('util');
const fs = require('fs');
const execPromise = util.promisify(exec);
const { setSurvey, pool } = require('../connection');
async function createProject(pid, name, src_srid, dst_srid, src_schema, dst_schema) {
const client = await pool.connect();
try {
await client.query('BEGIN');
// Determine default src_schema and src_srid
let src_schema_val;
let src_srid_val;
const res = await client.query(`
SELECT schema, meta->>'epsg' as epsg
FROM public.projects
ORDER BY CAST(SUBSTRING(schema FROM 8) AS INTEGER) DESC
LIMIT 1
`);
if (res.rows.length === 0) {
src_schema_val = 'survey_0';
src_srid_val = 23031;
} else {
src_schema_val = res.rows[0].schema;
src_srid_val = parseInt(res.rows[0].epsg, 10);
}
// Apply parameters or defaults
src_schema = src_schema || src_schema_val;
src_srid = src_srid ?? src_srid_val;
dst_srid = dst_srid ?? src_srid;
if (dst_schema === undefined) {
const srcNum = parseInt(src_schema.replace('survey_', ''), 10);
dst_schema = `survey_${srcNum + 1}`;
}
// Dump the source schema structure
const pgDumpCmd = `PGPASSWORD=${pool.options.password} pg_dump --schema-only --schema=${src_schema} --host=${pool.options.host} --port=${pool.options.port} --username=${pool.options.user} ${pool.options.database}`;
const { stdout: sqlDump } = await execPromise(pgDumpCmd);
//fs.writeFileSync('sqlDump.sql', sqlDump);
//console.log('Saved original SQL to sqlDump.sql');
// Modify the dump to use the destination schema and update SRID
const escapedSrcSchema = src_schema.replace(/[-/\\^$*+?.()|[\]{}]/g, '\\$&');
let modifiedSql = sqlDump
.replace(new RegExp(`CREATE SCHEMA ${escapedSrcSchema};`, 'gi'), `CREATE SCHEMA ${dst_schema};`)
.replace(new RegExp(`ALTER SCHEMA ${escapedSrcSchema} OWNER TO`, 'gi'), `ALTER SCHEMA ${dst_schema} OWNER TO`)
.replace(/SELECT pg_catalog\.set_config\('search_path',\s*'', false\);/, `SELECT pg_catalog.set_config('search_path', '${dst_schema}, public', false);`)
.replace(new RegExp(`${escapedSrcSchema}\\.`, 'g'), `${dst_schema}.`);
// Replace SRID in the SQL dump if src_srid !== dst_srid
if (src_srid !== dst_srid) {
// Replace SRID in geometry column definitions (e.g., geometry(Point, 23031))
modifiedSql = modifiedSql.replace(
new RegExp(`geometry\\((\\w+),\\s*${src_srid}\\s*\\)`, 'g'),
`geometry($1, ${dst_srid})`
);
// Replace SRID in AddGeometryColumn calls (if used in the dump)
modifiedSql = modifiedSql.replace(
new RegExp(`AddGeometryColumn\\((['"]?)${escapedSrcSchema}\\1,\\s*(['"]\\w+['"]),\\s*(['"]\\w+['"]),\\s*${src_srid},`, 'g'),
`AddGeometryColumn($1${dst_schema}$1, $2, $3, ${dst_srid},`
);
console.log(`Replaced SRID ${src_srid} with ${dst_srid} in SQL dump`);
}
//fs.writeFileSync('modifiedSql.sql', modifiedSql);
//console.log('Saved modified SQL to modifiedSql.sql');
// Execute the modified SQL to create the cloned schema
await client.query(modifiedSql);
console.log('Applied modified SQL successfully');
// Insert the new project into public.projects
const meta = { epsg: dst_srid.toString() }; // Ensure string for JSONB
await client.query(`
INSERT INTO public.projects (pid, name, schema, meta)
VALUES ($1, $2, $3, $4)
`, [pid, name, dst_schema, meta]);
console.log(`Inserted project ${pid} into public.projects with schema ${dst_schema}`);
await client.query('COMMIT');
console.log('Transaction committed successfully');
} catch (error) {
await client.query('ROLLBACK');
console.error('Transaction rolled back due to error:', error);
throw error;
} finally {
client.release();
console.log('Database client released');
}
}
module.exports = createProject;

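A hypothetical invocation of createProject above; the pid, name and SRID are illustrative, and the omitted schema arguments fall back to the most recent survey_N:

createProject("44", "North Sea 2025", undefined, 32631)
  .then(() => console.log("Project created"))
  .catch(err => console.error("Project creation failed", err));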
View File

@@ -46,10 +46,6 @@ function issue (payload, req, res) {
if (token) {
res.set("X-JWT", token);
const expiry = payload.exp ? (new Date(payload.exp*1000)).toUTCString() : null;
const cookie = expiry
? `JWT=${token}; path=/; SameSite=lax; expires=${expiry}`
: `JWT=${token}; path=/; SameSite=lax`
res.set("Set-Cookie", cookie); // For good measure
}
}

View File

@@ -1,52 +0,0 @@
const Queue = require('./queue');
// Inspired by:
// https://stackoverflow.com/questions/53540348/js-async-await-tasks-queue#53540586
class ActionsQueue extends Queue {
constructor (items = []) {
super(items);
this.pending = false;
}
enqueue (action) {
return new Promise ((resolve, reject) => {
super.enqueue({ action, resolve, reject });
this.dequeue();
});
}
async dequeue () {
if (this.pending) {
return false;
}
const item = super.dequeue();
if (!item) {
return false;
}
try {
this.pending = true;
const result = await item.action(this);
this.pending = false;
item.resolve(result);
} catch (err) {
this.pending = false;
item.reject(err);
} finally {
this.dequeue();
}
}
}
module.exports = ActionsQueue;

View File

@@ -1,6 +0,0 @@
module.exports = {
Queue: require('./queue'),
ActionsQueue: require('./actions-queue')
};

View File

@@ -1,22 +0,0 @@
class Queue {
constructor (items = []) {
this.items = items;
}
enqueue (item) {
this.items.push(item);
}
dequeue () {
return this.items.shift();
}
length () {
return this.items.length;
}
}
module.exports = Queue;

View File

@@ -29,18 +29,21 @@
"@dougal/binary": "file:../../modules/@dougal/binary",
"@dougal/organisations": "file:../../modules/@dougal/organisations",
"@dougal/user": "file:../../modules/@dougal/user",
"async": "^3.2.6",
"body-parser": "gitlab:aaltronav/contrib/expressjs/body-parser",
"busboy": "^1.6.0",
"compression": "^1.8.1",
"cookie-parser": "^1.4.5",
"csv": "^6.3.3",
"d3": "^6.7.0",
"debug": "^4.3.4",
"express": "^4.17.1",
"express-jwt": "^8.4.1",
"ipaddr.js": "^1.9.1",
"json2csv": "^5.0.6",
"jsonwebtoken": "^9.0.2",
"leaflet-headless": "git+https://git@gitlab.com/aaltronav/contrib/leaflet-headless.git#devel",
"marked": "^4.0.12",
"netmask": "^2.0.2",
"node-fetch": "^2.6.1",
"nunjucks": "^3.2.3",
"path-to-regexp": "^6.2.1",

View File

@@ -20,8 +20,10 @@ function start (server, pingInterval=30000) {
const exp = decoded?.exp;
if (exp) {
const timeout = (exp*1000 - Date.now()) / 2;
socket._jwtRefresh = setTimeout(() => refreshJwt(token), timeout);
console.log(`Scheduled JWT refresh in ${timeout/1000} seconds at time ${(new Date(Date.now() + timeout)).toISOString()}`);
if (!socket._jwtRefresh) {
socket._jwtRefresh = setTimeout(() => refreshJwt(token), timeout);
console.log(`Scheduled JWT refresh in ${timeout/1000} seconds at time ${(new Date(Date.now() + timeout)).toISOString()}`);
}
} else {
console.log("Token has no exp claim. Refresh not scheduled");
}
@@ -76,8 +78,8 @@ function start (server, pingInterval=30000) {
});
socket.on('close', () => {
if (socket._jwtTimeout) {
clearTimeout(socket._jwtTimeout);
if (socket._jwtRefresh) {
clearTimeout(socket._jwtRefresh);
}
});
});

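The refresh above fires at half the token's remaining lifetime, leaving room for one missed attempt before expiry. The arithmetic, with a hypothetical exp claim one hour out:

const exp = Math.floor(Date.now() / 1000) + 3600; // JWT exp: seconds since epoch
const timeout = (exp * 1000 - Date.now()) / 2;    // ms until refresh
console.log(`refresh in ~${Math.round(timeout / 1000)} s`); // ~1800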
206 package-lock.json generated
View File

@@ -39,10 +39,12 @@
"dependencies": {
"@deck.gl/aggregation-layers": "^9.1.13",
"@deck.gl/geo-layers": "^9.1.13",
"@deck.gl/mesh-layers": "^9.1.14",
"@dougal/binary": "file:../../../modules/@dougal/binary",
"@dougal/concurrency": "file:../../../modules/@dougal/concurrency",
"@dougal/organisations": "file:../../../modules/@dougal/organisations",
"@dougal/user": "file:../../../modules/@dougal/user",
"@loaders.gl/obj": "^4.3.4",
"@mdi/font": "^7.2.96",
"buffer": "^6.0.3",
"core-js": "^3.6.5",
@@ -3368,14 +3370,6 @@
"dev": true,
"license": "MIT"
},
"lib/www/client/source/node_modules/bytes": {
"version": "3.0.0",
"dev": true,
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
},
"lib/www/client/source/node_modules/call-bind": {
"version": "1.0.5",
"dev": true,
@@ -3709,47 +3703,6 @@
"dev": true,
"license": "MIT"
},
"lib/www/client/source/node_modules/compressible": {
"version": "2.0.18",
"dev": true,
"license": "MIT",
"dependencies": {
"mime-db": ">= 1.43.0 < 2"
},
"engines": {
"node": ">= 0.6"
}
},
"lib/www/client/source/node_modules/compression": {
"version": "1.7.4",
"dev": true,
"license": "MIT",
"dependencies": {
"accepts": "~1.3.5",
"bytes": "3.0.0",
"compressible": "~2.0.16",
"debug": "2.6.9",
"on-headers": "~1.0.2",
"safe-buffer": "5.1.2",
"vary": "~1.1.2"
},
"engines": {
"node": ">= 0.8.0"
}
},
"lib/www/client/source/node_modules/compression/node_modules/debug": {
"version": "2.6.9",
"dev": true,
"license": "MIT",
"dependencies": {
"ms": "2.0.0"
}
},
"lib/www/client/source/node_modules/compression/node_modules/ms": {
"version": "2.0.0",
"dev": true,
"license": "MIT"
},
"lib/www/client/source/node_modules/connect-history-api-fallback": {
"version": "2.0.0",
"dev": true,
@@ -5413,14 +5366,6 @@
"node": ">= 0.10"
}
},
"lib/www/client/source/node_modules/ipaddr.js": {
"version": "2.1.0",
"dev": true,
"license": "MIT",
"engines": {
"node": ">= 10"
}
},
"lib/www/client/source/node_modules/is-arrayish": {
"version": "0.2.1",
"dev": true,
@@ -6495,14 +6440,6 @@
"dev": true,
"license": "MIT"
},
"lib/www/client/source/node_modules/on-headers": {
"version": "1.0.2",
"dev": true,
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
},
"lib/www/client/source/node_modules/onetime": {
"version": "5.1.2",
"dev": true,
@@ -8641,14 +8578,6 @@
"spdx-expression-parse": "^3.0.0"
}
},
"lib/www/client/source/node_modules/vary": {
"version": "1.1.2",
"dev": true,
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
},
"lib/www/client/source/node_modules/vue": {
"version": "2.6.14",
"license": "MIT"
@@ -9430,18 +9359,21 @@
"@dougal/binary": "file:../../modules/@dougal/binary",
"@dougal/organisations": "file:../../modules/@dougal/organisations",
"@dougal/user": "file:../../modules/@dougal/user",
"async": "^3.2.6",
"body-parser": "gitlab:aaltronav/contrib/expressjs/body-parser",
"busboy": "^1.6.0",
"compression": "^1.8.1",
"cookie-parser": "^1.4.5",
"csv": "^6.3.3",
"d3": "^6.7.0",
"debug": "^4.3.4",
"express": "^4.17.1",
"express-jwt": "^8.4.1",
"ipaddr.js": "^1.9.1",
"json2csv": "^5.0.6",
"jsonwebtoken": "^9.0.2",
"leaflet-headless": "git+https://git@gitlab.com/aaltronav/contrib/leaflet-headless.git#devel",
"marked": "^4.0.12",
"netmask": "^2.0.2",
"node-fetch": "^2.6.1",
"nunjucks": "^3.2.3",
"path-to-regexp": "^6.2.1",
@@ -10242,13 +10174,6 @@
"node": ">= 0.6"
}
},
"lib/www/server/node_modules/netmask": {
"version": "2.0.2",
"license": "MIT",
"engines": {
"node": ">= 0.4.0"
}
},
"lib/www/server/node_modules/nunjucks": {
"version": "3.2.4",
"license": "BSD-2-Clause",
@@ -13365,13 +13290,6 @@
"node": ">= 0.4.0"
}
},
"lib/www/server/node_modules/vary": {
"version": "1.1.2",
"license": "MIT",
"engines": {
"node": ">= 0.8"
}
},
"lib/www/server/node_modules/xtend": {
"version": "4.0.2",
"license": "MIT",
@@ -13493,19 +13411,18 @@
}
},
"node_modules/@deck.gl/mesh-layers": {
"version": "9.1.13",
"resolved": "https://registry.npmjs.org/@deck.gl/mesh-layers/-/mesh-layers-9.1.13.tgz",
"integrity": "sha512-ujhe9FtB4qRRCXH/hY5p+IQ5VO/AC+/dtern6CTzYzjGnUnAvsbIgBZ3jxSlb1B/D3wlVE778W2cmv7MIToJJg==",
"peer": true,
"version": "9.1.14",
"resolved": "https://registry.npmjs.org/@deck.gl/mesh-layers/-/mesh-layers-9.1.14.tgz",
"integrity": "sha512-NVUw0yG4stJfrklWCGP9j8bNlf9YQc4PccMeNNIHNrU/Je6/Va6dJZg0RGtVkeaTY1Lk3A7wRzq8/M5Urfvuiw==",
"dependencies": {
"@loaders.gl/gltf": "^4.2.0",
"@luma.gl/gltf": "^9.1.5",
"@luma.gl/shadertools": "^9.1.5"
"@luma.gl/gltf": "~9.1.9",
"@luma.gl/shadertools": "~9.1.9"
},
"peerDependencies": {
"@deck.gl/core": "^9.1.0",
"@luma.gl/core": "^9.1.5",
"@luma.gl/engine": "^9.1.5"
"@luma.gl/core": "~9.1.9",
"@luma.gl/engine": "~9.1.9"
}
},
"node_modules/@dougal/binary": {
@@ -13700,6 +13617,18 @@
"@loaders.gl/core": "^4.3.0"
}
},
"node_modules/@loaders.gl/obj": {
"version": "4.3.4",
"resolved": "https://registry.npmjs.org/@loaders.gl/obj/-/obj-4.3.4.tgz",
"integrity": "sha512-Rdn+NHjLI0jKYrKNicJuQJohnHh7QAv4szCji8eafYYMrVtSIonNozBXUfe/c4V7HL/FVvvHCkfC66rvLvayaQ==",
"dependencies": {
"@loaders.gl/loader-utils": "4.3.4",
"@loaders.gl/schema": "4.3.4"
},
"peerDependencies": {
"@loaders.gl/core": "^4.3.0"
}
},
"node_modules/@loaders.gl/schema": {
"version": "4.3.4",
"resolved": "https://registry.npmjs.org/@loaders.gl/schema/-/schema-4.3.4.tgz",
@@ -14243,6 +14172,11 @@
"node": ">=0.8"
}
},
"node_modules/async": {
"version": "3.2.6",
"resolved": "https://registry.npmjs.org/async/-/async-3.2.6.tgz",
"integrity": "sha512-htCUDlxyyCLMgaM3xXg0C0LW2xqfuQ6p05pCEIsXuyQ+a1koYKTuBMzRNwmybfLgvJDMd0r1LTn4+E0Ti6C2AA=="
},
"node_modules/asynckit": {
"version": "0.4.0",
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
@@ -14347,6 +14281,17 @@
"node": ">=0.10.0"
}
},
"node_modules/busboy": {
"version": "1.6.0",
"resolved": "https://registry.npmjs.org/busboy/-/busboy-1.6.0.tgz",
"integrity": "sha512-8SFQbg/0hQ9xy3UNTB0YEnsNBbWfhf7RtnzpL7TkBiTBRfrQ9Fxcnz7VJsleJpyp6rVLvXiuORqjlHi5q+PYuA==",
"dependencies": {
"streamsearch": "^1.1.0"
},
"engines": {
"node": ">=10.16.0"
}
},
"node_modules/bytes": {
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.2.tgz",
@@ -14444,6 +14389,34 @@
"node": ">= 10"
}
},
"node_modules/compressible": {
"version": "2.0.18",
"resolved": "https://registry.npmjs.org/compressible/-/compressible-2.0.18.tgz",
"integrity": "sha512-AF3r7P5dWxL8MxyITRMlORQNaOA2IkAFaTr4k7BUumjPtRpGDTZpl0Pb1XCO6JeDCBdp126Cgs9sMxqSjgYyRg==",
"dependencies": {
"mime-db": ">= 1.43.0 < 2"
},
"engines": {
"node": ">= 0.6"
}
},
"node_modules/compression": {
"version": "1.8.1",
"resolved": "https://registry.npmjs.org/compression/-/compression-1.8.1.tgz",
"integrity": "sha512-9mAqGPHLakhCLeNyxPkK4xVo746zQ/czLH1Ky+vkitMnWfWZps8r0qXuwhwizagCRttsL4lfG4pIOvaWLpAP0w==",
"dependencies": {
"bytes": "3.1.2",
"compressible": "~2.0.18",
"debug": "2.6.9",
"negotiator": "~0.6.4",
"on-headers": "~1.1.0",
"safe-buffer": "5.2.1",
"vary": "~1.1.2"
},
"engines": {
"node": ">= 0.8.0"
}
},
"node_modules/concat-map": {
"version": "0.0.1",
"resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz",
@@ -15615,6 +15588,15 @@
"node": ">=12"
}
},
"node_modules/ipaddr.js": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/ipaddr.js/-/ipaddr.js-2.2.0.tgz",
"integrity": "sha512-Ag3wB2o37wslZS19hZqorUnrnzSkpOVy+IiiDEiTqNubEYpYuHWIf6K4psgN2ZWKExS4xhVCrRVfb/wfW8fWJA==",
"dev": true,
"engines": {
"node": ">= 10"
}
},
"node_modules/is-buffer": {
"version": "1.1.6",
"resolved": "https://registry.npmjs.org/is-buffer/-/is-buffer-1.1.6.tgz",
@@ -15983,6 +15965,14 @@
"resolved": "https://registry.npmjs.org/nan/-/nan-2.23.0.tgz",
"integrity": "sha512-1UxuyYGdoQHcGg87Lkqm3FzefucTa0NAiOcuRsDmysep3c1LVCRK2krrUDafMWtjSG04htvAmvg96+SDknOmgQ=="
},
"node_modules/negotiator": {
"version": "0.6.4",
"resolved": "https://registry.npmjs.org/negotiator/-/negotiator-0.6.4.tgz",
"integrity": "sha512-myRT3DiWPHqho5PrJaIRyaMv2kgYf0mUVgBNOYMuCH5Ki1yEiQaf/ZJuQ62nvpc44wL5WDbTX7yGJi1Neevw8w==",
"engines": {
"node": ">= 0.6"
}
},
"node_modules/node-fetch": {
"version": "2.7.0",
"resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.7.0.tgz",
@@ -16090,6 +16080,14 @@
"node": ">= 0.8"
}
},
"node_modules/on-headers": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/on-headers/-/on-headers-1.1.0.tgz",
"integrity": "sha512-737ZY3yNnXy37FHkQxPzt4UZ2UWPWiCZWLvFZ4fu5cueciegX0zGPnrlY6bwRg4FdQOe9YU8MkmJwGhoMybl8A==",
"engines": {
"node": ">= 0.8"
}
},
"node_modules/once": {
"version": "1.4.0",
"resolved": "https://registry.npmjs.org/once/-/once-1.4.0.tgz",
@@ -16524,6 +16522,14 @@
"node": ">= 0.8"
}
},
"node_modules/streamsearch": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/streamsearch/-/streamsearch-1.1.0.tgz",
"integrity": "sha512-Mcc5wHehp9aXz1ax6bZUyY5afg9u2rv5cqQI3mRrYkGC8rW2hM02jWuwjtL++LS5qinSyhj2QfLyNsuc+VsExg==",
"engines": {
"node": ">=10.0.0"
}
},
"node_modules/string_decoder": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz",
@@ -16713,6 +16719,14 @@
"uuid": "bin/uuid"
}
},
"node_modules/vary": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/vary/-/vary-1.1.2.tgz",
"integrity": "sha512-BNGbWLfd0eUPabhkXUVm0j8uuvREyTh5ovRa/dyow/BqAbZJyC+5fU+IzQOzmAKzYqYRAISoRhdQr3eIZ/PXqg==",
"engines": {
"node": ">= 0.8"
}
},
"node_modules/verror": {
"version": "1.10.0",
"resolved": "https://registry.npmjs.org/verror/-/verror-1.10.0.tgz",

42 sbin/get-ip.sh Executable file
View File

@@ -0,0 +1,42 @@
#!/bin/bash
# Function to get the internet IP
get_internet_ip() {
INTERFACE=$(ip route get 8.8.8.8 2>/dev/null | awk '{for(i=1;i<=NF;i++) if ($i=="dev") {print $(i+1); exit}}')
if [ -z "$INTERFACE" ] || [ "$INTERFACE" = "lo" ]; then
return 1
fi
IP=$(ip -4 addr show dev "$INTERFACE" scope global | grep -oP '(?<=inet\s)\d{1,3}(\.\d{1,3}){3}(?=/)' | head -n1)
if [ -z "$IP" ]; then
return 1
fi
echo "$IP"
return 0
}
# Get all global non-loopback IPv4 addresses
ALL_IPS=$(ip -4 addr show scope global | grep -oP '(?<=inet\s)\d{1,3}(\.\d{1,3}){3}(?=/)')
ARG="${1:-local}"
if [ "$ARG" = "internet" ]; then
INTERNET_IP=$(get_internet_ip)
if [ $? -ne 0 ]; then
echo "No valid default route or global IPv4 address found."
exit 1
fi
echo "$INTERNET_IP"
else
# For "local" or anything else
INTERNET_IP=$(get_internet_ip) || INTERNET_IP="" # If fails, set to empty so no exclusion
for IP in $ALL_IPS; do
if [ "$IP" != "$INTERNET_IP" ]; then
echo "$IP"
fi
done
fi

40 sbin/update-dns.sh Executable file
View File

@@ -0,0 +1,40 @@
#!/bin/bash
# Path to Nginx configuration file
NGINX_CONFIG="/etc/nginx/vhosts.d/dougal.conf"
# Extract the first hostname matching 'lan.dougal' from the config
# Assumes server_name lines like: server_name hostname1 hostname2;
HOSTNAME=$(grep -oE '[a-zA-Z0-9.-]*lan\.dougal[a-zA-Z0-9.-]*' "$NGINX_CONFIG" | head -n 1)
if [ -z "$HOSTNAME" ]; then
echo "Error: No matching hostname found in $NGINX_CONFIG"
exit 1
fi
# Path to IP retrieval script
IP_SCRIPT="$HOME/software/sbin/get-ip.sh"
# Get the current IPv4 address
IP_ADDRESS=$("$IP_SCRIPT" | head -n1)
if [ -z "$IP_ADDRESS" ]; then
echo "Error: Failed to retrieve IP address from $IP_SCRIPT"
exit 1
fi
# Check for DYNDNS_PASSWD environment variable
if [ -z "$DYNDNS_PASSWD" ]; then
echo "Error: DYNDNS_PASSWD environment variable is not set"
exit 1
fi
# Hurricane Electric DynDNS update URL
UPDATE_URL="https://dyn.dns.he.net/nic/update?hostname=$HOSTNAME&password=$DYNDNS_PASSWD&myip=$IP_ADDRESS"
# Send the update request and capture the response
RESPONSE=$(curl -s "$UPDATE_URL")
# Output the response for logging/debugging
echo "Update response for $HOSTNAME ($IP_ADDRESS): $RESPONSE"