Gateways, domains, and new service interface (#3001)

* add support for inbound proxies

* backend changes

* fix file type

* proxy -> tunnel, implement backend apis

* wip start-tunneld

* add domains and gateways, remove routers, fix docs links

* don't show hidden actions

* show and test dns

* edit instead of change acme and change gateway

* refactor: domains page

* refactor: gateways page

* domains and acme refactor

* certificate authorities

* refactor public/private gateways

* fix fe types

* domains mostly finished

* refactor: add file control to form service

* add ip util to sdk

* domains api + migration

* start service interface page, WIP

* different options for clearnet domains

* refactor: styles for interfaces page

* minor

* better placeholder for no addresses

* start sorting addresses

* best address logic

* comments

* fix unnecessary export

* MVP of service interface page

* domains preferred

* fix: address comments

* only translations left

* wip: start-tunnel & fix build

* forms for adding domain, rework things based on new ideas

* fix: dns testing

* public domain, max width, descriptions for dns

* nix StartOS domains, implement public and private domains at interface scope

* restart tor instead of reset

* better icon for restart tor

* dns

* fix sort functions for public and private domains

* with todos

* update types

* clean up tech debt, bump dependencies

* revert to ts-rs v9

* fix all types

* fix dns form

* add missing translations

* it builds

* fix: comments (#3009)

* fix: comments

* undo default

---------

Co-authored-by: Matt Hill <mattnine@protonmail.com>

* fix: refactor legacy components (#3010)

* fix: comments

* fix: refactor legacy components

* remove default again

---------

Co-authored-by: Matt Hill <mattnine@protonmail.com>

* more translations

* wip

* fix deadlock

* could work

* simple renaming

* placeholder for empty service interfaces table

* honor hidden form values

* remove logs

* reason instead of description

* fix dns

* misc fixes

* implement toggling gateways for service interface

* fix showing dns records

* move status column in service list

* remove unnecessary truthy check

* refactor: refactor forms components and remove legacy Taiga UI package (#3012)

* handle wh file uploads

* wip: debugging tor

* socks5 proxy working

* refactor: fix multiple comments (#3013)

* refactor: fix multiple comments

* styling changes, add documentation to sidebar

* translations for dns page

* refactor: subtle colors

* rearrange service page

---------

Co-authored-by: Matt Hill <mattnine@protonmail.com>

* fix file_stream and remove non-terminating test

* clean up logs

* support for sccache

* fix gha sccache

* more marketplace translations

* install wizard clarity

* stub hostnameInfo in migration

* fix address info after setup, fix styling on SI page, new 040 release notes

* remove tor logs from os

* misc fixes

* reset tor still not functioning...

* update ts

* minor styling and wording

* chore: some fixes (#3015)

* fix gateway renames

* different handling for public domains

* styling fixes

* whole navbar should not be clickable on service show page

* timeout getState request

* remove links from changelog

* misc fixes from pairing

* use custom name for gateway in more places

* fix dns parsing

* closes #3003

* closes #2999

* chore: some fixes (#3017)

* small copy change

* revert hardcoded error for testing

* don't require port forward if gateway is public

* use old wan ip when not available

* fix .const hanging on undefined

* fix test

* fix doc test

* fix renames

* update deps

* allow specifying dependency metadata directly

* temporarily make dependencies not clickable in marketplace listings

* fix socks bind

* fix test

---------

Co-authored-by: Aiden McClelland <me@drbonez.dev>
Co-authored-by: waterplea <alexander@inkin.ru>
This commit is contained in:
Matt Hill
2025-09-09 21:43:51 -06:00
committed by GitHub
parent 1cc9a1a30b
commit add01ebc68
537 changed files with 19940 additions and 20551 deletions


@@ -3,18 +3,18 @@ use std::path::PathBuf;
use clap::Parser;
use itertools::Itertools;
use rpc_toolkit::{from_fn_async, Context, HandlerArgs, HandlerExt, ParentHandler};
use rpc_toolkit::{Context, HandlerArgs, HandlerExt, ParentHandler, from_fn_async};
use serde::{Deserialize, Serialize};
use ts_rs::TS;
use crate::context::CliContext;
use crate::prelude::*;
use crate::registry::context::RegistryContext;
use crate::registry::signer::sign::AnyVerifyingKey;
use crate::registry::signer::{ContactInfo, SignerInfo};
use crate::registry::RegistryDatabase;
use crate::registry::context::RegistryContext;
use crate::registry::signer::{ContactInfo, SignerInfo};
use crate::rpc_continuations::Guid;
use crate::util::serde::{display_serializable, HandlerExtSerde, WithIoFormat};
use crate::sign::AnyVerifyingKey;
use crate::util::serde::{HandlerExtSerde, WithIoFormat, display_serializable};
pub fn admin_api<C: Context>() -> ParentHandler<C> {
ParentHandler::new()


@@ -11,13 +11,13 @@ use url::Url;
use crate::prelude::*;
use crate::progress::PhaseProgressTrackerHandle;
use crate::registry::signer::commitment::merkle_archive::MerkleArchiveCommitment;
use crate::registry::signer::commitment::{Commitment, Digestable};
use crate::registry::signer::sign::{AnySignature, AnyVerifyingKey};
use crate::registry::signer::AcceptSigners;
use crate::s9pk::S9pk;
use crate::s9pk::merkle_archive::source::http::HttpSource;
use crate::s9pk::merkle_archive::source::{ArchiveSource, Section};
use crate::s9pk::S9pk;
use crate::sign::commitment::merkle_archive::MerkleArchiveCommitment;
use crate::sign::commitment::{Commitment, Digestable};
use crate::sign::{AnySignature, AnyVerifyingKey};
use crate::upload::UploadingFile;
#[derive(Debug, Deserialize, Serialize, TS)]


@@ -1,222 +0,0 @@
use std::collections::BTreeMap;
use std::sync::Arc;
use std::time::{Duration, Instant, SystemTime, UNIX_EPOCH};
use axum::body::Body;
use axum::extract::Request;
use axum::response::Response;
use chrono::Utc;
use http::HeaderValue;
use rpc_toolkit::yajrc::RpcError;
use rpc_toolkit::{Middleware, RpcRequest, RpcResponse};
use serde::{Deserialize, Serialize};
use tokio::io::AsyncWriteExt;
use tokio::sync::Mutex;
use ts_rs::TS;
use url::Url;
use crate::prelude::*;
use crate::registry::context::RegistryContext;
use crate::registry::signer::commitment::request::RequestCommitment;
use crate::registry::signer::commitment::Commitment;
use crate::registry::signer::sign::{
AnySignature, AnySigningKey, AnyVerifyingKey, SignatureScheme,
};
use crate::util::serde::Base64;
pub const AUTH_SIG_HEADER: &str = "X-StartOS-Registry-Auth-Sig";
#[derive(Deserialize)]
pub struct Metadata {
#[serde(default)]
admin: bool,
#[serde(default)]
get_signer: bool,
}
#[derive(Clone)]
pub struct Auth {
nonce_cache: Arc<Mutex<BTreeMap<Instant, u64>>>, // for replay protection
signer: Option<Result<AnyVerifyingKey, RpcError>>,
}
impl Auth {
pub fn new() -> Self {
Self {
nonce_cache: Arc::new(Mutex::new(BTreeMap::new())),
signer: None,
}
}
async fn handle_nonce(&mut self, nonce: u64) -> Result<(), Error> {
let mut cache = self.nonce_cache.lock().await;
if cache.values().any(|n| *n == nonce) {
return Err(Error::new(
eyre!("replay attack detected"),
ErrorKind::Authorization,
));
}
while let Some(entry) = cache.first_entry() {
if entry.key().elapsed() > Duration::from_secs(60) {
entry.remove_entry();
} else {
break;
}
}
Ok(())
}
}
#[derive(Serialize, Deserialize, TS)]
pub struct RegistryAdminLogRecord {
pub timestamp: String,
pub name: String,
#[ts(type = "{ id: string | number | null; method: string; params: any }")]
pub request: RpcRequest,
pub key: AnyVerifyingKey,
}
pub struct SignatureHeader {
pub commitment: RequestCommitment,
pub signer: AnyVerifyingKey,
pub signature: AnySignature,
}
impl SignatureHeader {
pub fn to_header(&self) -> HeaderValue {
let mut url: Url = "http://localhost".parse().unwrap();
self.commitment.append_query(&mut url);
url.query_pairs_mut()
.append_pair("signer", &self.signer.to_string());
url.query_pairs_mut()
.append_pair("signature", &self.signature.to_string());
HeaderValue::from_str(url.query().unwrap_or_default()).unwrap()
}
pub fn from_header(header: &HeaderValue) -> Result<Self, Error> {
let query: BTreeMap<_, _> = form_urlencoded::parse(header.as_bytes()).collect();
Ok(Self {
commitment: RequestCommitment::from_query(&header)?,
signer: query.get("signer").or_not_found("signer")?.parse()?,
signature: query.get("signature").or_not_found("signature")?.parse()?,
})
}
pub fn sign(signer: &AnySigningKey, body: &[u8], context: &str) -> Result<Self, Error> {
let timestamp = SystemTime::now()
.duration_since(UNIX_EPOCH)
.map(|d| d.as_secs() as i64)
.unwrap_or_else(|e| e.duration().as_secs() as i64 * -1);
let nonce = rand::random();
let commitment = RequestCommitment {
timestamp,
nonce,
size: body.len() as u64,
blake3: Base64(*blake3::hash(body).as_bytes()),
};
let signature = signer
.scheme()
.sign_commitment(&signer, &commitment, context)?;
Ok(Self {
commitment,
signer: signer.verifying_key(),
signature,
})
}
}
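
The `sign` and verify paths above agree on two details: timestamps are signed Unix seconds (negated when the clock reads before the epoch, via the `unwrap_or_else` branch), and a commitment is rejected when its timestamp is more than 30 seconds from the server clock. A std-only sketch of just those two pieces — `signed_unix_secs` and `within_skew` are illustrative names, not identifiers from this diff:

```rust
use std::time::{SystemTime, UNIX_EPOCH};

/// Signed Unix seconds; negative if the clock reads before 1970,
/// mirroring the `unwrap_or_else` fallback in the code above.
fn signed_unix_secs(now: SystemTime) -> i64 {
    match now.duration_since(UNIX_EPOCH) {
        Ok(d) => d.as_secs() as i64,
        Err(e) => -(e.duration().as_secs() as i64),
    }
}

/// Accept a commitment timestamp only within ±30 seconds of now.
fn within_skew(now_secs: i64, commitment_ts: i64) -> bool {
    (now_secs - commitment_ts).abs() <= 30
}

fn main() {
    let now = signed_unix_secs(SystemTime::now());
    assert!(within_skew(now, now - 30)); // exactly at the window edge
    assert!(!within_skew(now, now - 31)); // one second past it
    println!("ok");
}
```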
impl Middleware<RegistryContext> for Auth {
type Metadata = Metadata;
async fn process_http_request(
&mut self,
ctx: &RegistryContext,
request: &mut Request,
) -> Result<(), Response> {
if request.headers().contains_key(AUTH_SIG_HEADER) {
self.signer = Some(
async {
let SignatureHeader {
commitment,
signer,
signature,
} = SignatureHeader::from_header(
request
.headers()
.get(AUTH_SIG_HEADER)
.or_not_found("missing X-StartOS-Registry-Auth-Sig")
.with_kind(ErrorKind::InvalidRequest)?,
)?;
signer.scheme().verify_commitment(
&signer,
&commitment,
&ctx.hostname,
&signature,
)?;
let now = SystemTime::now()
.duration_since(UNIX_EPOCH)
.map(|d| d.as_secs() as i64)
.unwrap_or_else(|e| e.duration().as_secs() as i64 * -1);
if (now - commitment.timestamp).abs() > 30 {
return Err(Error::new(
eyre!("timestamp not within 30s of now"),
ErrorKind::InvalidSignature,
));
}
self.handle_nonce(commitment.nonce).await?;
let mut body = Vec::with_capacity(commitment.size as usize);
commitment.copy_to(request, &mut body).await?;
*request.body_mut() = Body::from(body);
Ok(signer)
}
.await
.map_err(RpcError::from),
);
}
Ok(())
}
async fn process_rpc_request(
&mut self,
ctx: &RegistryContext,
metadata: Self::Metadata,
request: &mut RpcRequest,
) -> Result<(), RpcResponse> {
async move {
let signer = self.signer.take().transpose()?;
if metadata.get_signer {
if let Some(signer) = &signer {
request.params["__auth_signer"] = to_value(signer)?;
}
}
if metadata.admin {
let signer = signer
.ok_or_else(|| Error::new(eyre!("UNAUTHORIZED"), ErrorKind::Authorization))?;
let db = ctx.db.peek().await;
let (guid, admin) = db.as_index().as_signers().get_signer_info(&signer)?;
if db.into_admins().de()?.contains(&guid) {
let mut log = tokio::fs::OpenOptions::new()
.create(true)
.append(true)
.open(ctx.datadir.join("admin.log"))
.await?;
log.write_all(
(serde_json::to_string(&RegistryAdminLogRecord {
timestamp: Utc::now().to_rfc3339(),
name: admin.name,
request: request.clone(),
key: signer,
})
.with_kind(ErrorKind::Serialization)?
+ "\n")
.as_bytes(),
)
.await?;
} else {
return Err(Error::new(eyre!("UNAUTHORIZED"), ErrorKind::Authorization));
}
}
Ok(())
}
.await
.map_err(|e| RpcResponse::from_result(Err(e)))
}
}
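
The deleted middleware's replay protection pairs the timestamp window with a nonce cache keyed by arrival time, pruned after 60 seconds. A self-contained sketch of that idea — `check_and_record` is a hypothetical helper combining the check, prune, and record steps, not code from this diff:

```rust
use std::collections::BTreeMap;
use std::time::{Duration, Instant};

/// Reject a nonce seen within the last 60 seconds; otherwise record it.
fn check_and_record(
    cache: &mut BTreeMap<Instant, u64>,
    nonce: u64,
) -> Result<(), &'static str> {
    // Drop entries older than the replay window first.
    if let Some(cutoff) = Instant::now().checked_sub(Duration::from_secs(60)) {
        cache.retain(|seen_at, _| *seen_at > cutoff);
    }
    if cache.values().any(|n| *n == nonce) {
        return Err("replay attack detected");
    }
    cache.insert(Instant::now(), nonce);
    Ok(())
}

fn main() {
    let mut cache = BTreeMap::new();
    assert!(check_and_record(&mut cache, 42).is_ok());
    assert!(check_and_record(&mut cache, 42).is_err()); // same nonce replayed
    assert!(check_and_record(&mut cache, 7).is_ok());
    println!("ok");
}
```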


@@ -3,26 +3,33 @@ use std::ops::Deref;
use std::path::{Path, PathBuf};
use std::sync::Arc;
use chrono::Utc;
use clap::Parser;
use imbl_value::InternedString;
use patch_db::PatchDb;
use reqwest::{Client, Proxy};
use rpc_toolkit::yajrc::RpcError;
use rpc_toolkit::{CallRemote, Context, Empty};
use rpc_toolkit::{CallRemote, Context, Empty, RpcRequest};
use serde::{Deserialize, Serialize};
use sqlx::PgPool;
use tokio::sync::broadcast::Sender;
use tracing::instrument;
use ts_rs::TS;
use url::Url;
use crate::context::config::{ContextConfig, CONFIG_PATH};
use crate::context::config::{CONFIG_PATH, ContextConfig};
use crate::context::{CliContext, RpcContext};
use crate::middleware::signature::SignatureAuthContext;
use crate::prelude::*;
use crate::registry::auth::{SignatureHeader, AUTH_SIG_HEADER};
use crate::registry::device_info::{DeviceInfo, DEVICE_INFO_HEADER};
use crate::registry::signer::sign::AnySigningKey;
use crate::registry::RegistryDatabase;
use crate::registry::device_info::{DEVICE_INFO_HEADER, DeviceInfo};
use crate::registry::signer::SignerInfo;
use crate::rpc_continuations::RpcContinuations;
use crate::sign::AnyVerifyingKey;
use crate::util::io::append_file;
const DEFAULT_REGISTRY_LISTEN: SocketAddr =
SocketAddr::new(std::net::IpAddr::V4(Ipv4Addr::LOCALHOST), 5959);
#[derive(Debug, Clone, Default, Deserialize, Serialize, Parser)]
#[serde(rename_all = "kebab-case")]
@@ -31,9 +38,9 @@ pub struct RegistryConfig {
#[arg(short = 'c', long = "config")]
pub config: Option<PathBuf>,
#[arg(short = 'l', long = "listen")]
pub listen: Option<SocketAddr>,
#[arg(short = 'h', long = "hostname")]
pub hostname: Option<InternedString>,
pub registry_listen: Option<SocketAddr>,
#[arg(short = 'H', long = "hostname")]
pub registry_hostname: Vec<InternedString>,
#[arg(short = 'p', long = "tor-proxy")]
pub tor_proxy: Option<Url>,
#[arg(short = 'd', long = "datadir")]
@@ -45,9 +52,9 @@ impl ContextConfig for RegistryConfig {
fn next(&mut self) -> Option<PathBuf> {
self.config.take()
}
fn merge_with(&mut self, other: Self) {
self.listen = self.listen.take().or(other.listen);
self.hostname = self.hostname.take().or(other.hostname);
fn merge_with(&mut self, mut other: Self) {
self.registry_listen = self.registry_listen.take().or(other.registry_listen);
self.registry_hostname.append(&mut other.registry_hostname);
self.tor_proxy = self.tor_proxy.take().or(other.tor_proxy);
self.datadir = self.datadir.take().or(other.datadir);
}
@@ -63,7 +70,7 @@ impl RegistryConfig {
}
pub struct RegistryContextSeed {
pub hostname: InternedString,
pub hostnames: Vec<InternedString>,
pub listen: SocketAddr,
pub db: TypedPatchDb<RegistryDatabase>,
pub datadir: PathBuf,
@@ -105,20 +112,15 @@ impl RegistryContext {
},
None => None,
};
if config.registry_hostname.is_empty() {
return Err(Error::new(
eyre!("missing required configuration: registry-hostname"),
ErrorKind::NotFound,
));
}
Ok(Self(Arc::new(RegistryContextSeed {
hostname: config
.hostname
.as_ref()
.ok_or_else(|| {
Error::new(
eyre!("missing required configuration: hostname"),
ErrorKind::NotFound,
)
})?
.clone(),
listen: config
.listen
.unwrap_or(SocketAddr::new(Ipv4Addr::LOCALHOST.into(), 5959)),
hostnames: config.registry_hostname.clone(),
listen: config.registry_listen.unwrap_or(DEFAULT_REGISTRY_LISTEN),
db,
datadir,
rpc_continuations: RpcContinuations::new(),
@@ -163,64 +165,28 @@ impl CallRemote<RegistryContext> for CliContext {
params: Value,
_: Empty,
) -> Result<Value, RpcError> {
use reqwest::header::{ACCEPT, CONTENT_LENGTH, CONTENT_TYPE};
use reqwest::Method;
use rpc_toolkit::yajrc::{GenericRpcMethod, Id, RpcRequest};
use rpc_toolkit::RpcResponse;
let url = self
.registry_url
.clone()
.ok_or_else(|| Error::new(eyre!("`--registry` required"), ErrorKind::InvalidRequest))?;
method = method.strip_prefix("registry.").unwrap_or(method);
let rpc_req = RpcRequest {
id: Some(Id::Number(0.into())),
method: GenericRpcMethod::<_, _, Value>::new(method),
params,
};
let body = serde_json::to_vec(&rpc_req)?;
let host = url.host().or_not_found("registry hostname")?.to_string();
let mut req = self
.client
.request(Method::POST, url)
.header(CONTENT_TYPE, "application/json")
.header(ACCEPT, "application/json")
.header(CONTENT_LENGTH, body.len());
if let Ok(key) = self.developer_key() {
req = req.header(
AUTH_SIG_HEADER,
SignatureHeader::sign(&AnySigningKey::Ed25519(key.clone()), &body, &host)?
.to_header(),
let url = if let Some(url) = self.registry_url.clone() {
url
} else if self.registry_hostname.is_some() {
format!(
"http://{}",
self.registry_listen.unwrap_or(DEFAULT_REGISTRY_LISTEN)
)
.parse()
.map_err(Error::from)?
} else {
return Err(
Error::new(eyre!("`--registry` required"), ErrorKind::InvalidRequest).into(),
);
}
let res = req.body(body).send().await?;
};
method = method.strip_prefix("registry.").unwrap_or(method);
let sig_context = self
.registry_hostname
.clone()
.or(url.host().as_ref().map(InternedString::from_display))
.or_not_found("registry hostname")?;
if !res.status().is_success() {
let status = res.status();
let txt = res.text().await?;
let mut res = Err(Error::new(
eyre!("{}", status.canonical_reason().unwrap_or(status.as_str())),
ErrorKind::Network,
));
if !txt.is_empty() {
res = res.with_ctx(|_| (ErrorKind::Network, txt));
}
return res.map_err(From::from);
}
match res
.headers()
.get(CONTENT_TYPE)
.and_then(|v| v.to_str().ok())
{
Some("application/json") => {
serde_json::from_slice::<RpcResponse>(&*res.bytes().await?)
.with_kind(ErrorKind::Deserialization)?
.result
}
_ => Err(Error::new(eyre!("unknown content type"), ErrorKind::Network).into()),
}
crate::middleware::signature::call_remote(self, url, &sig_context, method, params).await
}
}
@@ -231,10 +197,10 @@ impl CallRemote<RegistryContext, RegistryUrlParams> for RpcContext {
params: Value,
RegistryUrlParams { registry }: RegistryUrlParams,
) -> Result<Value, RpcError> {
use reqwest::header::{ACCEPT, CONTENT_LENGTH, CONTENT_TYPE};
use reqwest::Method;
use rpc_toolkit::yajrc::{GenericRpcMethod, Id, RpcRequest};
use reqwest::header::{ACCEPT, CONTENT_LENGTH, CONTENT_TYPE};
use rpc_toolkit::RpcResponse;
use rpc_toolkit::yajrc::{GenericRpcMethod, Id, RpcRequest};
let url = registry.join("rpc/v0")?;
method = method.strip_prefix("registry.").unwrap_or(method);
@@ -286,3 +252,72 @@ impl CallRemote<RegistryContext, RegistryUrlParams> for RpcContext {
}
}
}
#[derive(Deserialize)]
pub struct RegistryAuthMetadata {
#[serde(default)]
admin: bool,
}
#[derive(Serialize, Deserialize, TS)]
pub struct AdminLogRecord {
pub timestamp: String,
pub name: String,
#[ts(type = "{ id: string | number | null; method: string; params: any }")]
pub request: RpcRequest,
pub key: AnyVerifyingKey,
}
impl SignatureAuthContext for RegistryContext {
type Database = RegistryDatabase;
type AdditionalMetadata = RegistryAuthMetadata;
type CheckPubkeyRes = Option<(AnyVerifyingKey, SignerInfo)>;
fn db(&self) -> &TypedPatchDb<Self::Database> {
&self.db
}
async fn sig_context(
&self,
) -> impl IntoIterator<Item = Result<impl AsRef<str> + Send, Error>> + Send {
self.hostnames.iter().map(Ok)
}
fn check_pubkey(
db: &Model<Self::Database>,
pubkey: Option<&AnyVerifyingKey>,
metadata: Self::AdditionalMetadata,
) -> Result<Self::CheckPubkeyRes, Error> {
if metadata.admin {
if let Some(pubkey) = pubkey {
let (guid, admin) = db.as_index().as_signers().get_signer_info(pubkey)?;
if db.as_admins().de()?.contains(&guid) {
return Ok(Some((pubkey.clone(), admin)));
}
}
Err(Error::new(eyre!("UNAUTHORIZED"), ErrorKind::Authorization))
} else {
Ok(None)
}
}
async fn post_auth_hook(
&self,
check_pubkey_res: Self::CheckPubkeyRes,
request: &RpcRequest,
) -> Result<(), Error> {
use tokio::io::AsyncWriteExt;
if let Some((pubkey, admin)) = check_pubkey_res {
let mut log = append_file(self.datadir.join("admin.log")).await?;
log.write_all(
(serde_json::to_string(&AdminLogRecord {
timestamp: Utc::now().to_rfc3339(),
name: admin.name,
request: request.clone(),
key: pubkey,
})
.with_kind(ErrorKind::Serialization)?
+ "\n")
.as_bytes(),
)
.await?;
}
Ok(())
}
}


@@ -2,19 +2,19 @@ use std::path::PathBuf;
use clap::Parser;
use itertools::Itertools;
use patch_db::json_ptr::{JsonPointer, ROOT};
use patch_db::Dump;
use patch_db::json_ptr::{JsonPointer, ROOT};
use rpc_toolkit::yajrc::RpcError;
use rpc_toolkit::{from_fn_async, Context, HandlerArgs, HandlerExt, ParentHandler};
use rpc_toolkit::{Context, HandlerArgs, HandlerExt, ParentHandler, from_fn_async};
use serde::{Deserialize, Serialize};
use tracing::instrument;
use ts_rs::TS;
use crate::context::CliContext;
use crate::prelude::*;
use crate::registry::context::RegistryContext;
use crate::registry::RegistryDatabase;
use crate::util::serde::{apply_expr, HandlerExtSerde};
use crate::registry::context::RegistryContext;
use crate::util::serde::{HandlerExtSerde, apply_expr};
pub fn db_api<C: Context>() -> ParentHandler<C> {
ParentHandler::new()


@@ -15,8 +15,8 @@ use url::Url;
use crate::context::RpcContext;
use crate::prelude::*;
use crate::registry::context::RegistryContext;
use crate::util::lshw::{LshwDevice, LshwDisplay, LshwProcessor};
use crate::util::VersionString;
use crate::util::lshw::{LshwDevice, LshwDisplay, LshwProcessor};
use crate::version::VersionT;
pub const DEVICE_INFO_HEADER: &str = "X-StartOS-Device-Info";


@@ -5,7 +5,7 @@ use clap::Parser;
use imbl_value::InternedString;
use itertools::Itertools;
use models::DataUrl;
use rpc_toolkit::{from_fn_async, Context, Empty, HandlerArgs, HandlerExt, ParentHandler};
use rpc_toolkit::{Context, Empty, HandlerArgs, HandlerExt, ParentHandler, from_fn_async};
use serde::{Deserialize, Serialize};
use ts_rs::TS;
@@ -51,7 +51,6 @@ pub fn info_api<C: Context>() -> ParentHandler<C, WithIoFormat<Empty>> {
pub struct RegistryInfo {
pub name: Option<String>,
pub icon: Option<DataUrl<'static>>,
#[ts(as = "BTreeMap::<String, Category>")]
pub categories: BTreeMap<InternedString, Category>,
}


@@ -3,16 +3,16 @@ use std::collections::{BTreeMap, BTreeSet};
use axum::Router;
use futures::future::ready;
use models::DataUrl;
use rpc_toolkit::{from_fn_async, Context, HandlerExt, ParentHandler, Server};
use rpc_toolkit::{Context, HandlerExt, ParentHandler, Server, from_fn_async};
use serde::{Deserialize, Serialize};
use ts_rs::TS;
use crate::context::CliContext;
use crate::middleware::cors::Cors;
use crate::middleware::signature::SignatureAuth;
use crate::net::static_server::{bad_request, not_found, server_error};
use crate::net::web_server::{Accept, WebServer};
use crate::prelude::*;
use crate::registry::auth::Auth;
use crate::registry::context::RegistryContext;
use crate::registry::device_info::DeviceInfoMiddleware;
use crate::registry::os::index::OsIndex;
@@ -23,7 +23,6 @@ use crate::util::serde::HandlerExtSerde;
pub mod admin;
pub mod asset;
pub mod auth;
pub mod context;
pub mod db;
pub mod device_info;
@@ -95,7 +94,7 @@ pub fn registry_router(ctx: RegistryContext) -> Router {
any(
Server::new(move || ready(Ok(ctx.clone())), registry_api())
.middleware(Cors::new())
.middleware(Auth::new())
.middleware(SignatureAuth::new())
.middleware(DeviceInfoMiddleware::new()),
)
})


@@ -7,7 +7,7 @@ use clap::Parser;
use exver::Version;
use imbl_value::InternedString;
use itertools::Itertools;
use rpc_toolkit::{from_fn_async, Context, HandlerArgs, HandlerExt, ParentHandler};
use rpc_toolkit::{Context, HandlerArgs, HandlerExt, ParentHandler, from_fn_async};
use serde::{Deserialize, Serialize};
use ts_rs::TS;
use url::Url;
@@ -17,15 +17,15 @@ use crate::prelude::*;
use crate::progress::{FullProgressTracker, ProgressUnits};
use crate::registry::asset::RegistryAsset;
use crate::registry::context::RegistryContext;
use crate::registry::os::index::OsVersionInfo;
use crate::registry::os::SIG_CONTEXT;
use crate::registry::signer::commitment::blake3::Blake3Commitment;
use crate::registry::signer::sign::ed25519::Ed25519;
use crate::registry::signer::sign::{AnySignature, AnyVerifyingKey, SignatureScheme};
use crate::registry::os::index::OsVersionInfo;
use crate::s9pk::merkle_archive::hash::VerifyingWriter;
use crate::s9pk::merkle_archive::source::ArchiveSource;
use crate::s9pk::merkle_archive::source::http::HttpSource;
use crate::s9pk::merkle_archive::source::multi_cursor_file::MultiCursorFile;
use crate::s9pk::merkle_archive::source::ArchiveSource;
use crate::sign::commitment::blake3::Blake3Commitment;
use crate::sign::ed25519::Ed25519;
use crate::sign::{AnySignature, AnyVerifyingKey, SignatureScheme};
use crate::util::io::open_file;
use crate::util::serde::Base64;
@@ -101,10 +101,10 @@ async fn add_asset(
commitment,
}: AddAssetParams,
accessor: impl FnOnce(
&mut Model<OsVersionInfo>,
) -> &mut Model<BTreeMap<InternedString, RegistryAsset<Blake3Commitment>>>
+ UnwindSafe
+ Send,
&mut Model<OsVersionInfo>,
) -> &mut Model<BTreeMap<InternedString, RegistryAsset<Blake3Commitment>>>
+ UnwindSafe
+ Send,
) -> Result<(), Error> {
signer
.scheme()
@@ -207,7 +207,7 @@ pub async fn cli_add_asset(
return Err(Error::new(
eyre!("Unknown extension"),
ErrorKind::InvalidRequest,
))
));
}
};
@@ -302,10 +302,10 @@ async fn remove_asset(
signer,
}: RemoveAssetParams,
accessor: impl FnOnce(
&mut Model<OsVersionInfo>,
) -> &mut Model<BTreeMap<InternedString, RegistryAsset<Blake3Commitment>>>
+ UnwindSafe
+ Send,
&mut Model<OsVersionInfo>,
) -> &mut Model<BTreeMap<InternedString, RegistryAsset<Blake3Commitment>>>
+ UnwindSafe
+ Send,
) -> Result<(), Error> {
ctx.db
.mutate(|db| {


@@ -5,9 +5,9 @@ use std::path::{Path, PathBuf};
use clap::Parser;
use exver::Version;
use helpers::AtomicFile;
use imbl_value::{json, InternedString};
use imbl_value::{InternedString, json};
use itertools::Itertools;
use rpc_toolkit::{from_fn_async, Context, HandlerArgs, HandlerExt, ParentHandler};
use rpc_toolkit::{Context, HandlerArgs, HandlerExt, ParentHandler, from_fn_async};
use serde::{Deserialize, Serialize};
use ts_rs::TS;
@@ -16,11 +16,11 @@ use crate::prelude::*;
use crate::progress::{FullProgressTracker, ProgressUnits};
use crate::registry::asset::RegistryAsset;
use crate::registry::context::RegistryContext;
use crate::registry::os::index::OsVersionInfo;
use crate::registry::os::SIG_CONTEXT;
use crate::registry::signer::commitment::blake3::Blake3Commitment;
use crate::registry::signer::commitment::Commitment;
use crate::registry::os::index::OsVersionInfo;
use crate::s9pk::merkle_archive::source::multi_cursor_file::MultiCursorFile;
use crate::sign::commitment::Commitment;
use crate::sign::commitment::blake3::Blake3Commitment;
use crate::util::io::open_file;
pub fn get_api<C: Context>() -> ParentHandler<C> {
@@ -62,10 +62,10 @@ async fn get_os_asset(
ctx: RegistryContext,
GetOsAssetParams { version, platform }: GetOsAssetParams,
accessor: impl FnOnce(
&Model<OsVersionInfo>,
) -> &Model<BTreeMap<InternedString, RegistryAsset<Blake3Commitment>>>
+ UnwindSafe
+ Send,
&Model<OsVersionInfo>,
) -> &Model<BTreeMap<InternedString, RegistryAsset<Blake3Commitment>>>
+ UnwindSafe
+ Send,
) -> Result<RegistryAsset<Blake3Commitment>, Error> {
accessor(
ctx.db


@@ -1,4 +1,4 @@
use rpc_toolkit::{from_fn_async, Context, HandlerExt, ParentHandler};
use rpc_toolkit::{Context, HandlerExt, ParentHandler, from_fn_async};
pub mod add;
pub mod get;


@@ -6,7 +6,7 @@ use clap::Parser;
use exver::Version;
use imbl_value::InternedString;
use itertools::Itertools;
use rpc_toolkit::{from_fn_async, Context, HandlerArgs, HandlerExt, ParentHandler};
use rpc_toolkit::{Context, HandlerArgs, HandlerExt, ParentHandler, from_fn_async};
use serde::{Deserialize, Serialize};
use ts_rs::TS;
@@ -15,13 +15,13 @@ use crate::prelude::*;
use crate::progress::FullProgressTracker;
use crate::registry::asset::RegistryAsset;
use crate::registry::context::RegistryContext;
use crate::registry::os::index::OsVersionInfo;
use crate::registry::os::SIG_CONTEXT;
use crate::registry::signer::commitment::blake3::Blake3Commitment;
use crate::registry::signer::sign::ed25519::Ed25519;
use crate::registry::signer::sign::{AnySignature, AnyVerifyingKey, SignatureScheme};
use crate::s9pk::merkle_archive::source::multi_cursor_file::MultiCursorFile;
use crate::registry::os::index::OsVersionInfo;
use crate::s9pk::merkle_archive::source::ArchiveSource;
use crate::s9pk::merkle_archive::source::multi_cursor_file::MultiCursorFile;
use crate::sign::commitment::blake3::Blake3Commitment;
use crate::sign::ed25519::Ed25519;
use crate::sign::{AnySignature, AnyVerifyingKey, SignatureScheme};
use crate::util::io::open_file;
use crate::util::serde::Base64;
@@ -70,10 +70,10 @@ async fn sign_asset(
signature,
}: SignAssetParams,
accessor: impl FnOnce(
&mut Model<OsVersionInfo>,
) -> &mut Model<BTreeMap<InternedString, RegistryAsset<Blake3Commitment>>>
+ UnwindSafe
+ Send,
&mut Model<OsVersionInfo>,
) -> &mut Model<BTreeMap<InternedString, RegistryAsset<Blake3Commitment>>>
+ UnwindSafe
+ Send,
) -> Result<(), Error> {
ctx.db
.mutate(|db| {
@@ -165,7 +165,7 @@ pub async fn cli_sign_asset(
return Err(Error::new(
eyre!("Unknown extension"),
ErrorKind::InvalidRequest,
))
));
}
};


@@ -8,8 +8,8 @@ use ts_rs::TS;
use crate::prelude::*;
use crate::registry::asset::RegistryAsset;
use crate::registry::context::RegistryContext;
use crate::registry::signer::commitment::blake3::Blake3Commitment;
use crate::rpc_continuations::Guid;
use crate::sign::commitment::blake3::Blake3Commitment;
#[derive(Debug, Default, Deserialize, Serialize, HasModel, TS)]
#[serde(rename_all = "camelCase")]
@@ -44,11 +44,8 @@ pub struct OsVersionInfo {
#[ts(type = "string")]
pub source_version: VersionRange,
pub authorized: BTreeSet<Guid>,
#[ts(as = "BTreeMap::<String, RegistryAsset::<Blake3Commitment>>")]
pub iso: BTreeMap<InternedString, RegistryAsset<Blake3Commitment>>, // platform (i.e. x86_64-nonfree) -> asset
#[ts(as = "BTreeMap::<String, RegistryAsset::<Blake3Commitment>>")]
pub squashfs: BTreeMap<InternedString, RegistryAsset<Blake3Commitment>>, // platform (i.e. x86_64-nonfree) -> asset
#[ts(as = "BTreeMap::<String, RegistryAsset::<Blake3Commitment>>")]
pub img: BTreeMap<InternedString, RegistryAsset<Blake3Commitment>>, // platform (i.e. raspberrypi) -> asset
}


@@ -1,4 +1,4 @@
use rpc_toolkit::{from_fn_async, Context, HandlerExt, ParentHandler};
use rpc_toolkit::{Context, HandlerExt, ParentHandler, from_fn_async};
use crate::context::CliContext;
use crate::util::serde::HandlerExtSerde;


@@ -1,13 +1,12 @@
use std::collections::BTreeMap;
use chrono::Utc;
use chrono::{DateTime, NaiveDate, NaiveDateTime, Utc};
use clap::Parser;
use exver::{Version, VersionRange};
use imbl_value::InternedString;
use itertools::Itertools;
use rpc_toolkit::{from_fn_async, Context, HandlerExt, ParentHandler};
use rpc_toolkit::{Context, HandlerExt, ParentHandler, from_fn_async};
use serde::{Deserialize, Serialize};
use sqlx::query;
use ts_rs::TS;
use crate::context::CliContext;
@@ -15,8 +14,8 @@ use crate::prelude::*;
use crate::registry::context::RegistryContext;
use crate::registry::device_info::DeviceInfo;
use crate::registry::os::index::OsVersionInfo;
use crate::registry::signer::sign::AnyVerifyingKey;
use crate::util::serde::{display_serializable, HandlerExtSerde, WithIoFormat};
use crate::sign::AnyVerifyingKey;
use crate::util::serde::{HandlerExtSerde, WithIoFormat, display_serializable};
pub mod signer;
@@ -151,6 +150,33 @@ pub struct GetOsVersionParams {
pub device_info: Option<DeviceInfo>,
}
struct PgDateTime(DateTime<Utc>);
impl sqlx::Type<sqlx::Postgres> for PgDateTime {
fn type_info() -> <sqlx::Postgres as sqlx::Database>::TypeInfo {
sqlx::postgres::PgTypeInfo::with_oid(sqlx::postgres::types::Oid(1184))
}
}
impl sqlx::Encode<'_, sqlx::Postgres> for PgDateTime {
fn encode_by_ref(
&self,
buf: &mut <sqlx::Postgres as sqlx::Database>::ArgumentBuffer<'_>,
) -> Result<sqlx::encode::IsNull, sqlx::error::BoxDynError> {
fn postgres_epoch_datetime() -> NaiveDateTime {
NaiveDate::from_ymd_opt(2000, 1, 1)
.expect("expected 2000-01-01 to be a valid NaiveDate")
.and_hms_opt(0, 0, 0)
.expect("expected 2000-01-01T00:00:00 to be a valid NaiveDateTime")
}
let micros = (self.0.naive_utc() - postgres_epoch_datetime())
.num_microseconds()
.ok_or_else(|| format!("NaiveDateTime out of range for Postgres: {:?}", self.0))?;
micros.encode(buf)
}
fn size_hint(&self) -> usize {
std::mem::size_of::<i64>()
}
}
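
The `PgDateTime` encoder works because Postgres's binary `timestamptz` format is a 64-bit count of microseconds relative to the Postgres epoch, 2000-01-01T00:00:00Z, rather than the Unix epoch. The offset arithmetic can be sketched std-only — `unix_secs_to_pg_micros` is an illustrative helper, not part of the diff, and 946,684,800 is the number of seconds between the two epochs:

```rust
/// Seconds from the Unix epoch (1970-01-01) to the Postgres epoch (2000-01-01):
/// 30 years × 365 days + 7 leap days = 10,957 days = 946,684,800 s.
const PG_EPOCH_UNIX_SECS: i64 = 946_684_800;

/// Convert whole Unix seconds to Postgres `timestamptz` microseconds.
fn unix_secs_to_pg_micros(unix_secs: i64) -> i64 {
    (unix_secs - PG_EPOCH_UNIX_SECS) * 1_000_000
}

fn main() {
    // The Postgres epoch itself encodes as 0 microseconds.
    assert_eq!(unix_secs_to_pg_micros(946_684_800), 0);
    // One minute past the Postgres epoch.
    assert_eq!(unix_secs_to_pg_micros(946_684_860), 60_000_000);
    // Times before 2000-01-01 encode as negative values.
    assert!(unix_secs_to_pg_micros(0) < 0);
    println!("ok");
}
```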
pub async fn get_version(
ctx: RegistryContext,
GetOsVersionParams {
@@ -166,14 +192,13 @@ pub async fn get_version(
if let (Some(pool), Some(server_id), Some(arch)) = (&ctx.pool, server_id, &platform) {
let created_at = Utc::now();
query!(
"INSERT INTO user_activity (created_at, server_id, arch) VALUES ($1, $2, $3)",
created_at,
server_id,
&**arch
)
.execute(pool)
.await?;
sqlx::query("INSERT INTO user_activity (created_at, server_id, arch) VALUES ($1, $2, $3)")
.bind(PgDateTime(created_at))
.bind(server_id)
.bind(&**arch)
.execute(pool)
.await
.with_kind(ErrorKind::Database)?;
}
let target = target.unwrap_or(VersionRange::Any);
ctx.db


@@ -2,7 +2,7 @@ use std::collections::BTreeMap;
use clap::Parser;
use exver::Version;
use rpc_toolkit::{from_fn_async, Context, HandlerExt, ParentHandler};
use rpc_toolkit::{Context, HandlerExt, ParentHandler, from_fn_async};
use serde::{Deserialize, Serialize};
use ts_rs::TS;


@@ -15,13 +15,13 @@ use crate::prelude::*;
use crate::progress::{FullProgressTracker, ProgressTrackerWriter, ProgressUnits};
use crate::registry::context::RegistryContext;
use crate::registry::package::index::PackageVersionInfo;
use crate::registry::signer::commitment::merkle_archive::MerkleArchiveCommitment;
use crate::registry::signer::sign::ed25519::Ed25519;
use crate::registry::signer::sign::{AnySignature, AnyVerifyingKey, SignatureScheme};
use crate::s9pk::merkle_archive::source::http::HttpSource;
use crate::s9pk::merkle_archive::source::ArchiveSource;
use crate::s9pk::v2::SIG_CONTEXT;
use crate::s9pk::S9pk;
use crate::s9pk::merkle_archive::source::ArchiveSource;
use crate::s9pk::merkle_archive::source::http::HttpSource;
use crate::s9pk::v2::SIG_CONTEXT;
use crate::sign::commitment::merkle_archive::MerkleArchiveCommitment;
use crate::sign::ed25519::Ed25519;
use crate::sign::{AnySignature, AnyVerifyingKey, SignatureScheme};
use crate::util::io::TrackingIO;
#[derive(Debug, Deserialize, Serialize, TS)]


@@ -3,7 +3,7 @@ use std::collections::BTreeMap;
use clap::Parser;
use imbl_value::InternedString;
use models::PackageId;
use rpc_toolkit::{from_fn_async, Context, HandlerExt, ParentHandler};
use rpc_toolkit::{Context, HandlerExt, ParentHandler, from_fn_async};
use serde::{Deserialize, Serialize};
use ts_rs::TS;
@@ -11,7 +11,7 @@ use crate::context::CliContext;
use crate::prelude::*;
use crate::registry::context::RegistryContext;
use crate::registry::package::index::Category;
use crate::util::serde::{display_serializable, HandlerExtSerde, WithIoFormat};
use crate::util::serde::{HandlerExtSerde, WithIoFormat, display_serializable};
pub fn category_api<C: Context>() -> ParentHandler<C> {
ParentHandler::new()


@@ -12,8 +12,8 @@ use crate::prelude::*;
use crate::registry::context::RegistryContext;
use crate::registry::device_info::DeviceInfo;
use crate::registry::package::index::{PackageIndex, PackageVersionInfo};
use crate::util::serde::{display_serializable, WithIoFormat};
use crate::util::VersionString;
use crate::util::serde::{WithIoFormat, display_serializable};
#[derive(
Clone, Copy, Debug, PartialEq, Eq, PartialOrd, Ord, Deserialize, Serialize, TS, ValueEnum,


@@ -12,20 +12,19 @@ use crate::prelude::*;
use crate::registry::asset::RegistryAsset;
use crate::registry::context::RegistryContext;
use crate::registry::device_info::DeviceInfo;
use crate::registry::signer::commitment::merkle_archive::MerkleArchiveCommitment;
use crate::registry::signer::sign::{AnySignature, AnyVerifyingKey};
use crate::rpc_continuations::Guid;
use crate::s9pk::S9pk;
use crate::s9pk::git_hash::GitHash;
use crate::s9pk::manifest::{Alerts, Description, HardwareRequirements};
use crate::s9pk::merkle_archive::source::FileSource;
use crate::s9pk::S9pk;
use crate::sign::commitment::merkle_archive::MerkleArchiveCommitment;
use crate::sign::{AnySignature, AnyVerifyingKey};
#[derive(Debug, Default, Deserialize, Serialize, HasModel, TS)]
#[serde(rename_all = "camelCase")]
#[model = "Model<Self>"]
#[ts(export)]
pub struct PackageIndex {
#[ts(as = "BTreeMap::<String, Category>")]
pub categories: BTreeMap<InternedString, Category>,
pub packages: BTreeMap<PackageId, PackageInfo>,
}


@@ -1,4 +1,4 @@
use rpc_toolkit::{from_fn_async, Context, HandlerExt, ParentHandler};
use rpc_toolkit::{Context, HandlerExt, ParentHandler, from_fn_async};
use crate::context::CliContext;
use crate::prelude::*;


@@ -2,7 +2,7 @@ use std::collections::BTreeMap;
use clap::Parser;
use models::PackageId;
use rpc_toolkit::{from_fn_async, Context, HandlerExt, ParentHandler};
use rpc_toolkit::{Context, HandlerExt, ParentHandler, from_fn_async};
use serde::{Deserialize, Serialize};
use ts_rs::TS;


@@ -9,11 +9,8 @@ use ts_rs::TS;
use url::Url;
use crate::prelude::*;
use crate::registry::signer::commitment::Digestable;
use crate::registry::signer::sign::{AnySignature, AnyVerifyingKey, SignatureScheme};
pub mod commitment;
pub mod sign;
use crate::sign::commitment::Digestable;
use crate::sign::{AnySignature, AnyVerifyingKey, SignatureScheme};
#[derive(Debug, Deserialize, Serialize, HasModel, TS)]
#[serde(rename_all = "camelCase")]


@@ -1,50 +0,0 @@
use blake3::Hash;
use digest::Update;
use serde::{Deserialize, Serialize};
use tokio::io::AsyncWrite;
use ts_rs::TS;
use crate::prelude::*;
use crate::registry::signer::commitment::{Commitment, Digestable};
use crate::s9pk::merkle_archive::hash::VerifyingWriter;
use crate::s9pk::merkle_archive::source::ArchiveSource;
use crate::util::io::{ParallelBlake3Writer, TrackingIO};
use crate::util::serde::Base64;
use crate::CAP_10_MiB;
#[derive(Clone, Debug, Deserialize, Serialize, HasModel, PartialEq, Eq, TS)]
#[serde(rename_all = "camelCase")]
#[model = "Model<Self>"]
#[ts(export)]
pub struct Blake3Commitment {
pub hash: Base64<[u8; 32]>,
#[ts(type = "number")]
pub size: u64,
}
impl Digestable for Blake3Commitment {
fn update<D: Update>(&self, digest: &mut D) {
digest.update(&*self.hash);
digest.update(&u64::to_be_bytes(self.size));
}
}
impl<'a, Resource: ArchiveSource> Commitment<&'a Resource> for Blake3Commitment {
async fn create(resource: &'a Resource) -> Result<Self, Error> {
let mut hasher = TrackingIO::new(0, ParallelBlake3Writer::new(CAP_10_MiB));
resource.copy_all_to(&mut hasher).await?;
Ok(Self {
size: hasher.position(),
hash: Base64(*hasher.into_inner().finalize().await?.as_bytes()),
})
}
async fn copy_to<W: AsyncWrite + Unpin + Send>(
&self,
resource: &'a Resource,
writer: W,
) -> Result<(), Error> {
let mut hasher =
VerifyingWriter::new(writer, Some((Hash::from_bytes(*self.hash), self.size)));
resource.copy_to(0, self.size, &mut hasher).await?;
hasher.verify().await?;
Ok(())
}
}
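The `Digestable` impl for `Blake3Commitment` (removed here as part of the move to `crate::sign`) pins down a byte layout for signing: the 32-byte hash followed by the size as 8 big-endian bytes. A toy sketch of that contract, with a stand-in `Update` trait and `CountingDigest`/`Blake3CommitmentLike` types that are illustrative only (the real trait is `digest::Update`):

```rust
// Stand-in for `digest::Update`, just enough to observe the fed bytes.
trait Update {
    fn update(&mut self, data: &[u8]);
}

// A digest that only counts bytes, to make the layout visible.
struct CountingDigest {
    bytes: usize,
}
impl Update for CountingDigest {
    fn update(&mut self, data: &[u8]) {
        self.bytes += data.len();
    }
}

// Mirrors Blake3Commitment's field order: hash first, then size.
struct Blake3CommitmentLike {
    hash: [u8; 32],
    size: u64,
}
impl Blake3CommitmentLike {
    fn digest_into<D: Update>(&self, d: &mut D) {
        d.update(&self.hash);
        d.update(&u64::to_be_bytes(self.size));
    }
}

fn main() {
    let c = Blake3CommitmentLike { hash: [0; 32], size: 7 };
    let mut d = CountingDigest { bytes: 0 };
    c.digest_into(&mut d);
    assert_eq!(d.bytes, 40); // 32-byte hash + 8-byte big-endian size
}
```

Fixing the field order and widths this way is what lets signer and verifier reproduce the identical digest from the same commitment.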


@@ -1,127 +0,0 @@
use digest::Update;
use serde::{Deserialize, Serialize};
use tokio::io::AsyncWrite;
use ts_rs::TS;
use crate::prelude::*;
use crate::registry::signer::commitment::{Commitment, Digestable};
use crate::s9pk::merkle_archive::source::FileSource;
use crate::s9pk::merkle_archive::MerkleArchive;
use crate::s9pk::S9pk;
use crate::util::io::TrackingIO;
use crate::util::serde::Base64;
#[derive(Debug, Deserialize, Serialize, HasModel, TS)]
#[serde(rename_all = "camelCase")]
#[model = "Model<Self>"]
#[ts(export)]
pub struct MerkleArchiveCommitment {
pub root_sighash: Base64<[u8; 32]>,
#[ts(type = "number")]
pub root_maxsize: u64,
}
impl MerkleArchiveCommitment {
pub fn from_query(query: &str) -> Result<Option<Self>, Error> {
let mut root_sighash = None;
let mut root_maxsize = None;
for (k, v) in form_urlencoded::parse(query.as_bytes()) {
match &*k {
"rootSighash" => {
root_sighash = Some(v.parse()?);
}
"rootMaxsize" => {
root_maxsize = Some(v.parse()?);
}
_ => (),
}
}
if root_sighash.is_some() || root_maxsize.is_some() {
Ok(Some(Self {
root_sighash: root_sighash
.or_not_found("rootSighash required if rootMaxsize specified")
.with_kind(ErrorKind::InvalidRequest)?,
root_maxsize: root_maxsize
.or_not_found("rootMaxsize required if rootSighash specified")
.with_kind(ErrorKind::InvalidRequest)?,
}))
} else {
Ok(None)
}
}
}
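`from_query` above enforces a both-or-neither rule: `rootSighash` and `rootMaxsize` must appear together or not at all. A stand-alone sketch of that rule, using a hypothetical `parse_pair` helper over plain `Option`s instead of the real query parsing:

```rust
// Hypothetical distillation of MerkleArchiveCommitment::from_query's rule:
// either both parameters are present, or neither is.
fn parse_pair(
    sighash: Option<&str>,
    maxsize: Option<&str>,
) -> Result<Option<(String, u64)>, String> {
    match (sighash, maxsize) {
        (None, None) => Ok(None),
        (Some(h), Some(m)) => {
            let size: u64 = m.parse().map_err(|e| format!("{e}"))?;
            Ok(Some((h.to_string(), size)))
        }
        (Some(_), None) => Err("rootMaxsize required if rootSighash specified".into()),
        (None, Some(_)) => Err("rootSighash required if rootMaxsize specified".into()),
    }
}

fn main() {
    assert!(parse_pair(None, None).unwrap().is_none());
    assert!(parse_pair(Some("abc"), Some("1024")).unwrap().is_some());
    assert!(parse_pair(Some("abc"), None).is_err());
}
```

The diff expresses the same rule via `or_not_found(...)` inside the `is_some()` branch; either way, a half-specified commitment is rejected as an `InvalidRequest` rather than silently defaulted.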
impl Digestable for MerkleArchiveCommitment {
fn update<D: Update>(&self, digest: &mut D) {
digest.update(&*self.root_sighash);
digest.update(&u64::to_be_bytes(self.root_maxsize));
}
}
impl<'a, S: FileSource + Clone> Commitment<&'a MerkleArchive<S>> for MerkleArchiveCommitment {
async fn create(resource: &'a MerkleArchive<S>) -> Result<Self, Error> {
resource.commitment().await
}
async fn check(&self, resource: &'a MerkleArchive<S>) -> Result<(), Error> {
let MerkleArchiveCommitment {
root_sighash,
root_maxsize,
} = resource.commitment().await?;
if root_sighash != self.root_sighash {
return Err(Error::new(
eyre!("merkle root mismatch"),
ErrorKind::InvalidSignature,
));
}
if root_maxsize > self.root_maxsize {
return Err(Error::new(
eyre!("merkle root directory max size too large"),
ErrorKind::InvalidSignature,
));
}
Ok(())
}
async fn copy_to<W: AsyncWrite + Unpin + Send>(
&self,
resource: &'a MerkleArchive<S>,
writer: W,
) -> Result<(), Error> {
self.check(resource).await?;
resource
.serialize(&mut TrackingIO::new(0, writer), true)
.await
}
}
impl<'a, S: FileSource + Clone> Commitment<&'a S9pk<S>> for MerkleArchiveCommitment {
async fn create(resource: &'a S9pk<S>) -> Result<Self, Error> {
resource.as_archive().commitment().await
}
async fn check(&self, resource: &'a S9pk<S>) -> Result<(), Error> {
let MerkleArchiveCommitment {
root_sighash,
root_maxsize,
} = resource.as_archive().commitment().await?;
if root_sighash != self.root_sighash {
return Err(Error::new(
eyre!("merkle root mismatch"),
ErrorKind::InvalidSignature,
));
}
if root_maxsize > self.root_maxsize {
return Err(Error::new(
eyre!("merkle root directory max size too large"),
ErrorKind::InvalidSignature,
));
}
Ok(())
}
async fn copy_to<W: AsyncWrite + Unpin + Send>(
&self,
resource: &'a S9pk<S>,
writer: W,
) -> Result<(), Error> {
self.check(resource).await?;
resource
.clone()
.serialize(&mut TrackingIO::new(0, writer), true)
.await
}
}


@@ -1,25 +0,0 @@
use digest::Update;
use futures::Future;
use tokio::io::AsyncWrite;
use crate::prelude::*;
pub mod blake3;
pub mod merkle_archive;
pub mod request;
pub trait Digestable {
fn update<D: Update>(&self, digest: &mut D);
}
pub trait Commitment<Resource>: Sized + Digestable {
fn create(resource: Resource) -> impl Future<Output = Result<Self, Error>> + Send;
fn copy_to<W: AsyncWrite + Unpin + Send>(
&self,
resource: Resource,
writer: W,
) -> impl Future<Output = Result<(), Error>> + Send;
fn check(&self, resource: Resource) -> impl Future<Output = Result<(), Error>> + Send {
self.copy_to(resource, tokio::io::sink())
}
}


@@ -1,103 +0,0 @@
use std::collections::BTreeMap;
use std::time::{SystemTime, UNIX_EPOCH};
use axum::body::Body;
use axum::extract::Request;
use digest::Update;
use futures::TryStreamExt;
use http::HeaderValue;
use serde::{Deserialize, Serialize};
use tokio::io::AsyncWrite;
use tokio_util::io::StreamReader;
use ts_rs::TS;
use url::Url;
use crate::prelude::*;
use crate::registry::signer::commitment::{Commitment, Digestable};
use crate::s9pk::merkle_archive::hash::VerifyingWriter;
use crate::util::serde::Base64;
#[derive(Clone, Debug, Deserialize, Serialize, HasModel, PartialEq, Eq, TS)]
#[serde(rename_all = "camelCase")]
#[model = "Model<Self>"]
#[ts(export)]
pub struct RequestCommitment {
#[ts(type = "number")]
pub timestamp: i64,
#[ts(type = "number")]
pub nonce: u64,
#[ts(type = "number")]
pub size: u64,
pub blake3: Base64<[u8; 32]>,
}
impl RequestCommitment {
pub fn append_query(&self, url: &mut Url) {
url.query_pairs_mut()
.append_pair("timestamp", &self.timestamp.to_string())
.append_pair("nonce", &self.nonce.to_string())
.append_pair("size", &self.size.to_string())
.append_pair("blake3", &self.blake3.to_string());
}
pub fn from_query(query: &HeaderValue) -> Result<Self, Error> {
let query: BTreeMap<_, _> = form_urlencoded::parse(query.as_bytes()).collect();
Ok(Self {
timestamp: query.get("timestamp").or_not_found("timestamp")?.parse()?,
nonce: query.get("nonce").or_not_found("nonce")?.parse()?,
size: query.get("size").or_not_found("size")?.parse()?,
blake3: query.get("blake3").or_not_found("blake3")?.parse()?,
})
}
}
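`append_query` and `from_query` above are intended to round-trip the four commitment fields through URL query parameters. A simplified sketch with plain strings in place of `url::Url`/`form_urlencoded` (workable here only because every value is numeric or hex, so no percent-encoding is needed); `append_query` and `get_param` below are illustrative stand-ins, not the PR's API:

```rust
// Stand-in for RequestCommitment::append_query: serialize the four fields.
fn append_query(timestamp: i64, nonce: u64, size: u64, blake3_hex: &str) -> String {
    format!("timestamp={timestamp}&nonce={nonce}&size={size}&blake3={blake3_hex}")
}

// Stand-in for the lookup side of RequestCommitment::from_query.
fn get_param<'a>(query: &'a str, key: &str) -> Option<&'a str> {
    query
        .split('&')
        .filter_map(|kv| kv.split_once('='))
        .find(|(k, _)| *k == key)
        .map(|(_, v)| v)
}

fn main() {
    let q = append_query(1_700_000_000, 42, 1024, &"ab".repeat(32));
    assert_eq!(get_param(&q, "nonce"), Some("42"));
    assert_eq!(get_param(&q, "size"), Some("1024"));
    assert_eq!(get_param(&q, "missing"), None);
}
```

The real `from_query` additionally treats an absent key as an error via `or_not_found`, since a commitment with any field missing cannot be verified.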
impl Digestable for RequestCommitment {
fn update<D: Update>(&self, digest: &mut D) {
digest.update(&i64::to_be_bytes(self.timestamp));
digest.update(&u64::to_be_bytes(self.nonce));
digest.update(&u64::to_be_bytes(self.size));
digest.update(&*self.blake3);
}
}
impl<'a> Commitment<&'a mut Request> for RequestCommitment {
async fn create(resource: &'a mut Request) -> Result<Self, Error> {
use http_body_util::BodyExt;
let body = std::mem::replace(resource.body_mut(), Body::empty())
.collect()
.await
.with_kind(ErrorKind::Network)?
.to_bytes();
let res = Self {
timestamp: SystemTime::now()
.duration_since(UNIX_EPOCH)
.map(|d| d.as_secs() as i64)
.unwrap_or_else(|e| e.duration().as_secs() as i64 * -1),
nonce: rand::random(),
size: body.len() as u64,
blake3: Base64(*blake3::hash(&*body).as_bytes()),
};
*resource.body_mut() = Body::from(body);
Ok(res)
}
async fn copy_to<W: AsyncWrite + Unpin + Send>(
&self,
resource: &'a mut Request,
writer: W,
) -> Result<(), Error> {
use tokio::io::AsyncReadExt;
let mut body = StreamReader::new(
std::mem::replace(resource.body_mut(), Body::empty())
.into_data_stream()
.map_err(std::io::Error::other),
)
.take(self.size);
let mut writer = VerifyingWriter::new(
writer,
Some((blake3::Hash::from_bytes(*self.blake3), self.size)),
);
tokio::io::copy(&mut body, &mut writer).await?;
writer.verify().await?;
Ok(())
}
}


@@ -1,34 +0,0 @@
use ed25519_dalek::{Signature, SigningKey, VerifyingKey};
use sha2::Sha512;
use crate::prelude::*;
use crate::registry::signer::sign::SignatureScheme;
pub struct Ed25519;
impl SignatureScheme for Ed25519 {
type SigningKey = SigningKey;
type VerifyingKey = VerifyingKey;
type Signature = Signature;
type Digest = Sha512;
fn new_digest(&self) -> Self::Digest {
<Self::Digest as digest::Digest>::new()
}
fn sign(
&self,
key: &Self::SigningKey,
digest: Self::Digest,
context: &str,
) -> Result<Self::Signature, Error> {
Ok(key.sign_prehashed(digest, Some(context.as_bytes()))?)
}
fn verify(
&self,
key: &Self::VerifyingKey,
digest: Self::Digest,
context: &str,
signature: &Self::Signature,
) -> Result<(), Error> {
key.verify_prehashed_strict(digest, Some(context.as_bytes()), signature)?;
Ok(())
}
}


@@ -1,347 +0,0 @@
use std::fmt::Display;
use std::str::FromStr;
use ::ed25519::pkcs8::BitStringRef;
use clap::builder::ValueParserFactory;
use der::referenced::OwnedToRef;
use models::FromStrParser;
use pkcs8::der::AnyRef;
use pkcs8::{PrivateKeyInfo, SubjectPublicKeyInfo};
use serde::{Deserialize, Serialize};
use sha2::Sha512;
use ts_rs::TS;
use crate::prelude::*;
use crate::registry::signer::commitment::Digestable;
use crate::registry::signer::sign::ed25519::Ed25519;
use crate::util::serde::{deserialize_from_str, serialize_display};
pub mod ed25519;
pub trait SignatureScheme {
type SigningKey;
type VerifyingKey;
type Signature;
type Digest: digest::Update;
fn new_digest(&self) -> Self::Digest;
fn sign(
&self,
key: &Self::SigningKey,
digest: Self::Digest,
context: &str,
) -> Result<Self::Signature, Error>;
fn sign_commitment<C: Digestable>(
&self,
key: &Self::SigningKey,
commitment: &C,
context: &str,
) -> Result<Self::Signature, Error> {
let mut digest = self.new_digest();
commitment.update(&mut digest);
self.sign(key, digest, context)
}
fn verify(
&self,
key: &Self::VerifyingKey,
digest: Self::Digest,
context: &str,
signature: &Self::Signature,
) -> Result<(), Error>;
fn verify_commitment<C: Digestable>(
&self,
key: &Self::VerifyingKey,
commitment: &C,
context: &str,
signature: &Self::Signature,
) -> Result<(), Error> {
let mut digest = self.new_digest();
commitment.update(&mut digest);
self.verify(key, digest, context, signature)
}
}
pub enum AnyScheme {
Ed25519(Ed25519),
}
impl From<Ed25519> for AnyScheme {
fn from(value: Ed25519) -> Self {
Self::Ed25519(value)
}
}
impl SignatureScheme for AnyScheme {
type SigningKey = AnySigningKey;
type VerifyingKey = AnyVerifyingKey;
type Signature = AnySignature;
type Digest = AnyDigest;
fn new_digest(&self) -> Self::Digest {
match self {
Self::Ed25519(s) => AnyDigest::Sha512(s.new_digest()),
}
}
fn sign(
&self,
key: &Self::SigningKey,
digest: Self::Digest,
context: &str,
) -> Result<Self::Signature, Error> {
match (self, key, digest) {
(Self::Ed25519(s), AnySigningKey::Ed25519(key), AnyDigest::Sha512(digest)) => {
Ok(AnySignature::Ed25519(s.sign(key, digest, context)?))
}
_ => Err(Error::new(
eyre!("mismatched signature algorithm"),
ErrorKind::InvalidSignature,
)),
}
}
fn verify(
&self,
key: &Self::VerifyingKey,
digest: Self::Digest,
context: &str,
signature: &Self::Signature,
) -> Result<(), Error> {
match (self, key, digest, signature) {
(
Self::Ed25519(s),
AnyVerifyingKey::Ed25519(key),
AnyDigest::Sha512(digest),
AnySignature::Ed25519(signature),
) => s.verify(key, digest, context, signature),
_ => Err(Error::new(
eyre!("mismatched signature algorithm"),
ErrorKind::InvalidSignature,
)),
}
}
}
#[derive(Clone, Debug, PartialEq, Eq, TS)]
#[ts(export, type = "string")]
pub enum AnySigningKey {
Ed25519(<Ed25519 as SignatureScheme>::SigningKey),
}
impl AnySigningKey {
pub fn scheme(&self) -> AnyScheme {
match self {
Self::Ed25519(_) => AnyScheme::Ed25519(Ed25519),
}
}
pub fn verifying_key(&self) -> AnyVerifyingKey {
match self {
Self::Ed25519(k) => AnyVerifyingKey::Ed25519(k.into()),
}
}
}
impl<'a> TryFrom<PrivateKeyInfo<'a>> for AnySigningKey {
type Error = pkcs8::Error;
fn try_from(value: PrivateKeyInfo<'a>) -> Result<Self, Self::Error> {
if value.algorithm == ed25519_dalek::pkcs8::ALGORITHM_ID {
Ok(Self::Ed25519(ed25519_dalek::SigningKey::try_from(value)?))
} else {
Err(pkcs8::spki::Error::OidUnknown {
oid: value.algorithm.oid,
}
.into())
}
}
}
impl pkcs8::EncodePrivateKey for AnySigningKey {
fn to_pkcs8_der(&self) -> pkcs8::Result<pkcs8::SecretDocument> {
match self {
Self::Ed25519(s) => s.to_pkcs8_der(),
}
}
}
impl FromStr for AnySigningKey {
type Err = Error;
fn from_str(s: &str) -> Result<Self, Self::Err> {
use pkcs8::DecodePrivateKey;
Self::from_pkcs8_pem(s).with_kind(ErrorKind::Deserialization)
}
}
impl Display for AnySigningKey {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
use pkcs8::EncodePrivateKey;
f.write_str(
&self
.to_pkcs8_pem(pkcs8::LineEnding::LF)
.map_err(|_| std::fmt::Error)?,
)
}
}
impl<'de> Deserialize<'de> for AnySigningKey {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: serde::Deserializer<'de>,
{
deserialize_from_str(deserializer)
}
}
impl Serialize for AnySigningKey {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::Serializer,
{
serialize_display(self, serializer)
}
}
#[derive(Clone, Debug, PartialEq, Eq, Hash, TS)]
#[ts(export, type = "string")]
pub enum AnyVerifyingKey {
Ed25519(<Ed25519 as SignatureScheme>::VerifyingKey),
}
impl AnyVerifyingKey {
pub fn scheme(&self) -> AnyScheme {
match self {
Self::Ed25519(_) => AnyScheme::Ed25519(Ed25519),
}
}
}
impl<'a> TryFrom<SubjectPublicKeyInfo<AnyRef<'a>, BitStringRef<'a>>> for AnyVerifyingKey {
type Error = pkcs8::spki::Error;
fn try_from(
value: SubjectPublicKeyInfo<AnyRef<'a>, BitStringRef<'a>>,
) -> Result<Self, Self::Error> {
if value.algorithm == ed25519_dalek::pkcs8::ALGORITHM_ID {
Ok(Self::Ed25519(ed25519_dalek::VerifyingKey::try_from(value)?))
} else {
Err(pkcs8::spki::Error::OidUnknown {
oid: value.algorithm.oid,
})
}
}
}
impl pkcs8::EncodePublicKey for AnyVerifyingKey {
fn to_public_key_der(&self) -> pkcs8::spki::Result<pkcs8::Document> {
match self {
Self::Ed25519(s) => s.to_public_key_der(),
}
}
}
impl FromStr for AnyVerifyingKey {
type Err = Error;
fn from_str(s: &str) -> Result<Self, Self::Err> {
use pkcs8::DecodePublicKey;
Self::from_public_key_pem(s).with_kind(ErrorKind::Deserialization)
}
}
impl Display for AnyVerifyingKey {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
use pkcs8::EncodePublicKey;
f.write_str(
&self
.to_public_key_pem(pkcs8::LineEnding::LF)
.map_err(|_| std::fmt::Error)?,
)
}
}
impl<'de> Deserialize<'de> for AnyVerifyingKey {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: serde::Deserializer<'de>,
{
deserialize_from_str(deserializer)
}
}
impl Serialize for AnyVerifyingKey {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::Serializer,
{
serialize_display(self, serializer)
}
}
impl ValueParserFactory for AnyVerifyingKey {
type Parser = FromStrParser<Self>;
fn value_parser() -> Self::Parser {
Self::Parser::new()
}
}
#[derive(Clone, Debug)]
pub enum AnyDigest {
Sha512(Sha512),
}
impl digest::Update for AnyDigest {
fn update(&mut self, data: &[u8]) {
match self {
Self::Sha512(d) => digest::Update::update(d, data),
}
}
}
#[derive(Clone, Debug, PartialEq, Eq, TS)]
#[ts(export, type = "string")]
pub enum AnySignature {
Ed25519(<Ed25519 as SignatureScheme>::Signature),
}
impl FromStr for AnySignature {
type Err = Error;
fn from_str(s: &str) -> Result<Self, Self::Err> {
use der::DecodePem;
#[derive(der::Sequence)]
struct AnySignatureDer {
alg: pkcs8::spki::AlgorithmIdentifierOwned,
sig: der::asn1::OctetString,
}
impl der::pem::PemLabel for AnySignatureDer {
const PEM_LABEL: &'static str = "SIGNATURE";
}
let der = AnySignatureDer::from_pem(s.as_bytes()).with_kind(ErrorKind::Deserialization)?;
if der.alg.oid == ed25519_dalek::pkcs8::ALGORITHM_ID.oid
&& der.alg.parameters.owned_to_ref() == ed25519_dalek::pkcs8::ALGORITHM_ID.parameters
{
Ok(Self::Ed25519(
ed25519_dalek::Signature::from_slice(der.sig.as_bytes())
.with_kind(ErrorKind::Deserialization)?,
))
} else {
Err(pkcs8::spki::Error::OidUnknown { oid: der.alg.oid })
.with_kind(ErrorKind::Deserialization)
}
}
}
impl Display for AnySignature {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
use der::EncodePem;
#[derive(der::Sequence)]
struct AnySignatureDer<'a> {
alg: pkcs8::AlgorithmIdentifierRef<'a>,
sig: der::asn1::OctetString,
}
impl<'a> der::pem::PemLabel for AnySignatureDer<'a> {
const PEM_LABEL: &'static str = "SIGNATURE";
}
f.write_str(
&match self {
Self::Ed25519(s) => AnySignatureDer {
alg: ed25519_dalek::pkcs8::ALGORITHM_ID,
sig: der::asn1::OctetString::new(s.to_bytes()).map_err(|_| std::fmt::Error)?,
},
}
.to_pem(der::pem::LineEnding::LF)
.map_err(|_| std::fmt::Error)?,
)
}
}
impl<'de> Deserialize<'de> for AnySignature {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: serde::Deserializer<'de>,
{
deserialize_from_str(deserializer)
}
}
impl Serialize for AnySignature {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::Serializer,
{
serialize_display(self, serializer)
}
}