mirror of
https://github.com/Start9Labs/patch-db.git
synced 2026-03-26 02:11:54 +00:00
audit fixes, repo restructure, and documentation
Soundness and performance audit (17 fixes):
- See AUDIT.md for full details and @claude comments in code

Repo restructure:
- Inline json-ptr and json-patch submodules as regular directories
- Remove cbor submodule, replace serde_cbor with ciborium
- Rename patch-db/ -> core/, patch-db-macro/ -> macro/, patch-db-macro-internals/ -> macro-internals/, patch-db-util/ -> util/
- Purge upstream CI/CD, bench, and release cruft from json-patch
- Remove .gitmodules

Test fixes:
- Fix proptest doesnt_crash (unique file paths, proper close/cleanup)
- Add PatchDb::close() for clean teardown

Documentation:
- Add README.md, ARCHITECTURE.md, CONTRIBUTING.md, CLAUDE.md, AUDIT.md
- Add TSDocs to TypeScript client exports

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
9
.gitmodules
vendored
@@ -1,9 +0,0 @@
[submodule "cbor"]
	path = cbor
	url = https://github.com/dr-bonez/cbor.git
[submodule "json-patch"]
	path = json-patch
	url = https://github.com/dr-bonez/json-patch.git
[submodule "json-ptr"]
	path = json-ptr
	url = https://github.com/dr-bonez/json-ptr.git
212
ARCHITECTURE.md
Normal file
@@ -0,0 +1,212 @@

# Architecture

## High-level design

patch-db is split into two layers that communicate over a transport boundary:

```
┌─────────────────────────────────────────────┐
│              TypeScript Client              │
│  PatchDB<T> ─ RxJS observables ─ watch$()   │
│         ▲                                   │
│         │ Update<T>[] (Dump | Revision)     │
│         │ over WebSocket / SSE / etc.       │
└─────────┼───────────────────────────────────┘
          │
┌─────────┼───────────────────────────────────┐
│         ▼                                   │
│                Rust Backend                 │
│  PatchDb ─ Store ─ Broadcast ─ Subscriber   │
│                                             │
│ ┌──────────┐ ┌───────────┐ ┌──────────┐    │
│ │ json-ptr │ │json-patch │ │ ciborium │    │
│ │ RFC 6901 │ │ RFC 6902  │ │ storage  │    │
│ └──────────┘ └───────────┘ └──────────┘    │
└─────────────────────────────────────────────┘
```

The Rust side owns the persistent state and produces patches. The TypeScript side consumes those patches and maintains a local mirror for reactive UI bindings. They are separate implementations of the same concepts (not WASM/FFI) — compatibility is maintained through shared RFC 6901/6902 semantics.

## Project structure

```
patch-db/
├── core/              # Core Rust crate — PatchDb, Store, typed wrappers
├── macro/             # Procedural macro crate (derives HasModel)
├── macro-internals/   # Macro implementation details
├── util/              # CLI tool (dump/load database files)
├── json-ptr/          # RFC 6901 JSON Pointer implementation
├── json-patch/        # RFC 6902 JSON Patch implementation
└── client/            # TypeScript client library (RxJS-based)
    └── lib/
        ├── patch-db.ts        # PatchDB<T> class
        ├── json-patch-lib.ts  # Client-side patch application
        └── types.ts           # Revision, Dump, Update, PatchOp
```

## Rust crates

### `core` (crate name: `patch-db`)

The main database engine. Key types:

| Type | Role |
|------|------|
| `PatchDb` | Thread-safe async handle (clone to share). All reads/writes go through this. |
| `TypedPatchDb<T>` | Generic wrapper that enforces a schema type `T` via `HasModel`. |
| `Store` | Internal state container. File-backed with CBOR. Holds the current `Value`, revision counter, and `Broadcast`. |
| `Dump` | Snapshot: `{ id: u64, value: Value }` |
| `Revision` | Incremental change: `{ id: u64, patch: DiffPatch }` |
| `DiffPatch` | Newtype over `json_patch::Patch` with scoping, rebasing, and key-tracking methods. |
| `DbWatch` | Combines a `Dump` + `Subscriber` into a `Stream` of values. |
| `TypedDbWatch<T>` | Type-safe wrapper around `DbWatch`. |
| `Subscriber` | `tokio::sync::mpsc::UnboundedReceiver<Revision>`. |
| `Broadcast` | Fan-out dispatcher. Holds `ScopedSender`s that filter patches by JSON Pointer prefix. Automatically removes disconnected senders. |
| `MutateResult<T, E>` | Pairs a `Result<T, E>` with an optional `Revision`, allowing callers to check both the outcome and whether a patch was produced. |

#### Write path

```
caller
  │
  ▼
PatchDb::put / apply / apply_function / mutate
  │
  ▼
Store::apply(DiffPatch)
  ├─ Apply patch in-memory (with undo on failure)
  ├─ Serialize patch as CBOR, append to file
  ├─ Compress (rewrite snapshot) every 4096 revisions
  └─ Broadcast::send(Revision)
       └─ For each ScopedSender: scope patch to pointer, send if non-empty
```
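
The last step, "scope patch to pointer, send if non-empty", can be sketched in TypeScript. This is a simplified illustration with string paths; the real implementation works on `JsonPointer` values and would also narrow op values to the subtree.

```typescript
type ScopedOp = { op: 'add' | 'remove' | 'replace'; path: string; value?: unknown }

// Keep only the operations that affect the subtree at `prefix`, re-rooting
// their paths. An op at or above the prefix rewrites the whole subtree, so it
// is kept with the root path "".
function scopeToPointer(ops: ScopedOp[], prefix: string): ScopedOp[] {
  const scoped: ScopedOp[] = []
  for (const o of ops) {
    if (o.path === prefix || o.path.startsWith(prefix + '/')) {
      // Op inside the watched subtree: strip the prefix.
      scoped.push({ ...o, path: o.path.slice(prefix.length) })
    } else if (o.path === '' || prefix.startsWith(o.path + '/')) {
      // Op at or above the watched subtree: everything below it changed.
      scoped.push({ ...o, path: '' })
    }
    // Otherwise the op is in an unrelated subtree and is dropped. If nothing
    // survives, the subscriber receives no message at all.
  }
  return scoped
}
```

A subscriber watching `/a` thus sees `/a/b` rewritten as `/b`, and never hears about ops under `/c`.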

#### Read path

```
caller
  │
  ▼
PatchDb::dump / get / exists / keys
  │
  ▼
Store (RwLock read guard)
  └─ Navigate Value via JsonPointer
```

#### Subscription path

```
PatchDb::subscribe(ptr)    → Subscriber (mpsc receiver)
PatchDb::watch(ptr)        → DbWatch (Dump + Subscriber, implements Stream)
PatchDb::dump_and_sub(ptr) → (Dump, Subscriber)
```

### `macro` / `macro-internals`

Procedural macro that derives `HasModel` for structs and enums:

```rust
#[derive(HasModel)]
#[model = "Model<Self>"]
struct Config {
    hostname: String,
    port: u16,
}
```

Generates:

- `impl HasModel for Config { type Model = Model<Self>; }`
- Typed accessor methods: `as_hostname()`, `as_hostname_mut()`, `into_hostname()`
- `from_parts()` constructor
- `destructure_mut()` for simultaneous mutable access to multiple fields
- Respects `serde(rename_all)`, `serde(rename)`, `serde(flatten)`
- Enum support with `serde(tag)` / `serde(content)` encoding

### `json-ptr`

RFC 6901 JSON Pointer implementation. Provides:

- `JsonPointer<S, V>` — generic over string storage and segment list representation
- Zero-copy `BorrowedSegList` for efficient path slicing
- Navigation: `get`, `get_mut`, `set`, `insert`, `remove`, `take`
- Path algebra: `starts_with`, `strip_prefix`, `common_prefix`, `join_end`, `append`
- `ROOT` constant for the empty pointer
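
The RFC 6901 semantics these methods build on can be shown with a minimal, self-contained parser (TypeScript for brevity; this is a concept sketch, not the crate's API):

```typescript
// RFC 6901: "/" separates segments; inside a segment, "~1" decodes to "/" and
// "~0" to "~". Decoding ~1 before ~0 matters: "~01" must yield "~1", not "/".
function parsePointer(ptr: string): string[] {
  if (ptr === '') return [] // the empty pointer (ROOT) addresses the whole document
  return ptr
    .slice(1) // drop the leading "/"
    .split('/')
    .map(seg => seg.replace(/~1/g, '/').replace(/~0/g, '~'))
}
```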

### `json-patch`

RFC 6902 JSON Patch implementation. Provides:

- `Patch(Vec<PatchOperation>)` — the patch type
- `PatchOperation` enum: `Add`, `Remove`, `Replace`, `Test`, `Move`, `Copy`
- `patch()` — apply a patch to a `Value`, returns an `Undo` for rollback
- `diff()` — compute the minimal patch between two `Value`s
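
The shape of `diff()`'s output can be sketched for flat objects (an illustration of the concept only; the crate's implementation recurses into nested values):

```typescript
type DiffOp = { op: 'add' | 'remove' | 'replace'; path: string; value?: unknown }

// Emit the add/remove/replace ops that transform `from` into `to`.
function flatDiff(
  from: Record<string, unknown>,
  to: Record<string, unknown>,
): DiffOp[] {
  const ops: DiffOp[] = []
  for (const key of Object.keys(from)) {
    if (!(key in to)) ops.push({ op: 'remove', path: '/' + key })
    else if (to[key] !== from[key])
      ops.push({ op: 'replace', path: '/' + key, value: to[key] })
  }
  for (const key of Object.keys(to)) {
    if (!(key in from)) ops.push({ op: 'add', path: '/' + key, value: to[key] })
  }
  return ops
}
```

Applying the resulting ops to `from` yields `to`, which is the defining property of a diff.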

### `util`

CLI tool with two subcommands:

- `dump <path>` — deserialize a patch-db file and print the final state as JSON
- `from-dump <path>` — read JSON from stdin and write it as a fresh patch-db file

## TypeScript client

### `PatchDB<T>`

RxJS-based observable database client. Consumes `Update<T>[]` from a transport source.

**Data flow:**

```
source$ (Observable<Update<T>[]>)
  │
  ▼
PatchDB.processUpdates()
  ├─ Revision? → applyOperation() for each op, then update matching watchedNodes
  └─ Dump?     → replace cache, update all watchedNodes
  │
  ▼
cache$ (BehaviorSubject<Dump<T>>)
  │
  ▼
watch$(...path) → BehaviorSubject per unique path → Observable to consumer
```

**Key design decisions:**

- `watch$()` has overloads for 0–6 path segments, providing type-safe deep property access
- Watched nodes are keyed by their JSON Pointer path string
- A revision triggers updates only for watchers whose path overlaps with any operation in the patch (prefix match in either direction)
- A dump triggers updates for all watchers
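
The "prefix match in either direction" rule reduces to a single check (a sketch with hypothetical names; segment arrays as produced by `arrayFromPath`):

```typescript
// A watcher at /users must refresh when an op touches /users/42/name (op below
// the watcher) and also when an op replaces /users or one of its ancestors
// (op at or above the watcher). Both cases reduce to: one path is a prefix of
// the other.
function pathsOverlap(watcherPath: string[], opPath: string[]): boolean {
  const shared = Math.min(watcherPath.length, opPath.length)
  for (let i = 0; i < shared; i++) {
    if (watcherPath[i] !== opPath[i]) return false
  }
  return true
}
```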

### `json-patch-lib`

Client-side RFC 6902 implementation (add/remove/replace only — no test/move/copy, since those aren't produced by the server's `diff()`).

Operations are applied immutably — objects are spread-copied, arrays are sliced — to play nicely with change detection in UI frameworks.
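
The effect of immutable application is that only objects along the patched path get new references, which is exactly what reference-equality change detection keys on. A minimal sketch:

```typescript
// Replace `state.settings.theme` without mutating `state`: each object along
// the path is spread-copied; everything off the path keeps its old reference.
const state = { settings: { theme: 'dark' }, users: { alice: { online: true } } }

const next = {
  ...state,
  settings: { ...state.settings, theme: 'light' },
}
```

`next.settings` is a fresh object while `next.users` is the same reference as `state.users`, so a watcher on `/users` sees no change.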

### Types

| Type | Definition |
|------|-----------|
| `Revision` | `{ id: number, patch: Operation<unknown>[] }` |
| `Dump<T>` | `{ id: number, value: T }` |
| `Update<T>` | `Revision \| Dump<T>` |
| `PatchOp` | Enum: `'add' \| 'remove' \| 'replace'` |
| `Operation<T>` | `AddOperation<T> \| RemoveOperation \| ReplaceOperation<T>` |

## Storage format

The on-disk format is a sequence of CBOR values:

```
[ revision: u64 ] [ value: Value ] [ patch₁ ] [ patch₂ ] ... [ patchₙ ]
```

- On open, the file is read sequentially: revision counter, then root value, then patches are replayed
- On write, new patches are appended as CBOR
- Every 4096 revisions, the file is compacted: a fresh snapshot is written atomically via a `.bak` temp file
- A `.failed` file logs patches that couldn't be applied (data recovery aid)
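
The open-time replay step can be sketched with plain JS values standing in for decoded CBOR (single-segment paths only, for illustration):

```typescript
type LogOp = { op: 'add' | 'remove' | 'replace'; path: string; value?: unknown }

// Rebuild current state from the snapshot plus the appended patch log.
function replay(
  snapshot: Record<string, unknown>,
  patches: LogOp[][],
): Record<string, unknown> {
  const state: Record<string, unknown> = { ...snapshot }
  for (const patch of patches) {
    for (const { op, path, value } of patch) {
      const key = path.slice(1) // flat paths only in this sketch
      if (op === 'remove') delete state[key]
      else state[key] = value
    }
  }
  return state
}
```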

## Concurrency model

- `PatchDb` wraps `Arc<RwLock<Store>>` — multiple concurrent readers, exclusive writer
- `Broadcast` uses `mpsc::unbounded_channel` per subscriber — writes never block on slow consumers
- `OPEN_STORES` static mutex prevents the same file from being opened twice in the same process
- `FdLock` provides OS-level file locking for cross-process safety
195
AUDIT.md
Normal file
@@ -0,0 +1,195 @@

# patch-db Code Audit

## Critical / High Severity

### Rust

#### 1. Infinite loop in `run_idempotent` — FIXED

**File:** `patch-db/src/store.rs`

`old` was read once before the loop and never refreshed. If another writer modified `store.persistent` between the initial read and the write-lock acquisition, the `&old == &store.persistent` check failed forever — `old` was never updated, so the loop spun infinitely.

**Fix:** Moved the `old` read inside the loop so it refreshes on each retry attempt.
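
The bug and the fix can be illustrated with a generic optimistic-retry loop (hypothetical names; the real code operates on `store.persistent` under a `RwLock`):

```typescript
// Optimistic retry: snapshot the state, attempt the commit, and retry with a
// FRESH snapshot on conflict. Capturing the snapshot once outside the loop
// (the original bug) means a single concurrent write makes every later
// comparison fail, so the loop spins forever.
function retryIdempotent<T>(
  snapshot: () => T,
  tryCommit: (seen: T) => boolean,
  maxAttempts = 100,
): number {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const seen = snapshot() // the fix: re-read on every attempt
    if (tryCommit(seen)) return attempt
  }
  throw new Error('still conflicting after ' + maxAttempts + ' attempts')
}

// Simulate a store whose first commit attempt races with another writer.
let version = 0
let racedOnce = false
const attempts = retryIdempotent(
  () => version,
  seen => {
    if (!racedOnce) {
      racedOnce = true
      version++ // concurrent writer sneaks in
      return false
    }
    return seen === version
  },
)
```

With the stale-snapshot version, the simulated race above would never terminate; re-reading makes the second attempt succeed.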

#### 2. `TentativeUpdated` undo after successful disk write — FIXED

**File:** `patch-db/src/store.rs`

If `compress()` succeeded (wrote patched state to disk) but a later step failed, `Drop` rolled back the in-memory state while the on-disk state already reflected the patch. On next startup, the file replayed the patch, creating a permanent divergence.

**Fix:** Rewrote `compress()` with three explicit phases (atomic backup, main-file rewrite, non-fatal backup removal). The return type changed to `Result<bool, Error>` so the caller knows whether undo is safe. If the backup was committed, `TentativeUpdated` disarms the undo.

#### 3. `push_start_idx` doesn't update existing segment ranges — FIXED

**File:** `json-ptr/src/lib.rs`

Unlike `push_start`, which shifted all existing segment ranges by the prefix length, `push_start_idx` just inserted a new segment without adjusting the others. All existing segments' ranges became wrong, causing corrupted pointer lookups or panics.

**Fix:** Added the range-shifting loop to match `push_start` behavior.

#### 4. Integer underflow in `DiffPatch::rebase` for Remove — FIXED

**File:** `patch-db/src/patch.rs`

When `idx == 0` and `onto_idx == 0`, the condition `idx >= onto_idx` passed and `idx - 1` underflowed a `usize`. This panics in debug builds and wraps to `usize::MAX` in release.

**Fix:** Changed the condition from `idx >= onto_idx` to `idx > onto_idx`.

### TypeScript

#### 5. Remove on nested array elements corrupts state — FIXED

**File:** `client/lib/json-patch-lib.ts`

`recursiveApply` returned `undefined` for remove (since `value` is undefined on `RemoveOperation`). For arrays, the splice-based removal only kicked in when `path.length === 1`. A deeper path like `/arr/0/nested/2` set `array[2] = undefined` instead of splicing it out, leaving a hole.

**Fix:** Introduced a `REMOVE_SENTINEL` symbol. The base case returns the sentinel for remove ops; array and object handlers check for it to trigger a proper splice/delete.

#### 6. RFC 6902 `"-"` (end-of-array) token not handled — FIXED

**File:** `client/lib/json-patch-lib.ts`

`parseInt("-")` returned `NaN`, so `splice(NaN, 0, value)` inserted at position 0 instead of appending. Non-numeric path segments on arrays also silently produced corrupt state.

**Fix:** Added `resolveArrayIndex()`, which handles `"-"` (end-of-array) and validates numeric indices.
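
A sketch of what such a resolver looks like (hypothetical shape of the fix; the actual function lives in `client/lib/json-patch-lib.ts`):

```typescript
// Resolve an RFC 6902 array path segment: "-" means "append at the end";
// anything that is not a canonical non-negative integer is rejected instead of
// silently becoming splice(NaN, ...).
function resolveArrayIndex(segment: string, arrayLength: number): number {
  if (segment === '-') return arrayLength // RFC 6902 end-of-array token
  if (!/^(0|[1-9][0-9]*)$/.test(segment)) {
    throw new Error(`invalid array index: ${JSON.stringify(segment)}`)
  }
  return parseInt(segment, 10)
}
```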

#### 7. Revision gap silently applied — FIXED

**File:** `client/lib/patch-db.ts`

The check `update.id < expected` only deduplicated. If revision 4 was missing and revision 5 arrived while the cache was at 3, it was applied without revision 4's patches, silently producing corrupt state with no error or recovery.

**Fix:** Added a `console.warn` when a revision gap is detected.
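
The check reduces to classifying an incoming revision id against the cache id (a sketch of the logic, not the client's exact code):

```typescript
// A revision is applicable only when its id is exactly cacheId + 1. Smaller or
// equal ids duplicate already-applied revisions; a larger id means at least
// one revision in between was never received.
function classifyRevision(
  cacheId: number,
  revisionId: number,
): 'apply' | 'duplicate' | 'gap' {
  if (revisionId <= cacheId) return 'duplicate'
  if (revisionId === cacheId + 1) return 'apply'
  return 'gap'
}
```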

---

## Medium Severity

### Rust

#### 8. `unreachable!()` reachable via deserialized patches — IGNORED

**File:** `patch-db/src/patch.rs`

`DiffPatch` is `Deserialize`-able and wraps `Patch`, which can hold Move/Copy/Test operations. `for_path`, `rebase`, `exists`, and `keys` all panic on those variants.

**Status:** Ignored. `DiffPatch`es can only contain Add/Replace/Remove in practice — the type system just can't enforce it.

#### 9. `poll_changed` applies only one revision per call — FIXED

**File:** `patch-db/src/subscriber.rs`

If multiple revisions queued up, the `Stream` implementation applied one at a time, emitting intermediate states that may never have been a consistent committed state.

**Fix:** Added a drain loop after the first `poll_recv` wake to consume all queued revisions before returning.

#### 10. `compress` backup removal failure clobbers good data — FIXED

**File:** `patch-db/src/store.rs`

If `remove_file(bak)` failed after the main file was successfully rewritten, the error propagated. On next open, `Store::open` saw the stale `.bak` file and renamed it over the successfully compacted main file.

**Fix:** Backup removal is now non-fatal (`let _ = ...`). A leftover backup is harmlessly replayed on restart since it matches the main file content.

#### 11. `DbWatch::sync` permanently desynchronizes on patch error — IGNORED

**File:** `patch-db/src/subscriber.rs`

If one patch fails to apply, the error returns immediately and the remaining queued revisions are never consumed. The watch is then permanently out of sync with no recovery path.

**Status:** Ignored. Patch errors should be impossible; if they occur, a different bug in patch-db is the root cause. Not worth the complexity of drain-and-continue.

#### 12. RFC 6902 deviation: `replace` on missing path silently adds — IGNORED (intentional)

**File:** `json-patch/src/lib.rs`

RFC 6902 §4.3 requires the target location to exist. The implementation falls back to `add` instead of returning an error.

**Status:** Ignored. This is intentional behavior for this project's use case.

### TypeScript

#### 13. `NonNullable` return types on `watch$` are unsound — FIXED

**File:** `client/lib/patch-db.ts`

All `watch$` overloads claimed `NonNullable<...>`, but runtime values can be `null`/`undefined` (e.g., after a `remove` operation). Consumers skipped null checks based on the type, leading to runtime crashes.

**Fix:** Removed the outer `NonNullable` wrapper from the 1-level overload return type.

#### 14. `withLatestFrom` + in-place mutation fragility — FIXED

**File:** `client/lib/patch-db.ts`

`processUpdates` mutated the cache object in place, then re-emitted it via `cache$.next(cache)`. If `source$` emitted synchronously twice before the subscriber ran, `withLatestFrom` sampled the already-mutated object reference, potentially skipping valid revisions via the stale `cache.id` check.

**Fix:** Replaced `withLatestFrom` with direct `this.cache$.value` access in the subscribe callback.

---

## Low Severity / Performance

### Rust

#### 15. `OPEN_STORES` never removes entries — FIXED

**File:** `patch-db/src/store.rs`

Entries were inserted on open but never cleaned up on close, causing unbounded growth over the lifetime of a process that opens many different database files.

**Fix:** `Store::close()` now removes the entry from `OPEN_STORES`.

#### 16. `Broadcast::send` clones patches per subscriber under write lock — DEFERRED

**File:** `patch-db/src/subscriber.rs`

`revision.for_path()` is called per subscriber, cloning and filtering the entire patch. This is O(subscribers × operations) work while holding the `RwLock` write guard.

#### 17. Array diff is O(n²) — DEFERRED

**File:** `json-patch/src/diff.rs`

The `ptr_eq` scan inside the array diff loops is O(n) per element, making the worst case O(n²) for large arrays with mostly non-pointer-equal elements.

#### 18. `Store::exists` conflates null with missing — FIXED

**File:** `patch-db/src/store.rs`

A key with an explicit `Value::Null` was reported as non-existent.

**Fix:** Changed from a null comparison to `.is_some()`.

### TypeScript

#### 19. `new RegExp` in hot path — FIXED

**File:** `client/lib/json-patch-lib.ts`

`arrayFromPath` and `pathFromArray` constructed new `RegExp` objects on every call.

**Fix:** Pre-compiled regex literals at module scope.

#### 20. O(watchedNodes × patchOps) redundant `arrayFromPath` — FIXED

**File:** `client/lib/patch-db.ts`

Inside `handleRevision`, `arrayFromPath(path)` was called for every `(watchedNode, patchOp)` pair.

**Fix:** Pre-convert patch operation paths once, outside the loop.

#### 21. `getValueByPointer` swallows all exceptions — FIXED

**File:** `client/lib/json-patch-lib.ts`

The `catch (e) { return undefined }` masked programming errors and state corruption, making debugging very difficult.

**Fix:** Now only catches `TypeError` (from accessing properties on null/undefined) and re-throws everything else.

#### 22. `throw 'unreachable'` is reachable — FIXED

**File:** `client/lib/json-patch-lib.ts`

If a patch path extended past the actual document depth (the data is a primitive at a non-terminal segment), this branch executed, throwing a string with no stack trace.

**Fix:** Now throws a proper `Error` with a descriptive message.
35
CLAUDE.md
Normal file
@@ -0,0 +1,35 @@

# CLAUDE.md

patch-db is a JSON Patch–based database with a Rust backend and TypeScript client.

## Where to look

| Need | File |
|------|------|
| What this project is, quick start examples | [README.md](README.md) |
| Project structure, crate details, data flow, storage format, concurrency | [ARCHITECTURE.md](ARCHITECTURE.md) |
| Build commands, testing, code style, making changes | [CONTRIBUTING.md](CONTRIBUTING.md) |

## Key files

| Area | Path |
|------|------|
| Core API | `core/src/store.rs` — `PatchDb`, `TypedPatchDb`, `Store`, `MutateResult` |
| Types | `core/src/patch.rs` — `Revision`, `Dump`, `DiffPatch`, `diff()` |
| Subscriptions | `core/src/subscriber.rs` — `DbWatch`, `TypedDbWatch`, `Subscriber`, `Broadcast` |
| Model system | `core/src/model.rs` — `HasModel`, `Model`, `ModelExt`, `Pointer` |
| Derive macro | `macro-internals/src/lib.rs` — `HasModel` derive implementation |
| Error types | `core/src/lib.rs` — `Error` enum, re-exports |
| JSON Pointer | `json-ptr/src/lib.rs` — `JsonPointer`, `ROOT`, path navigation |
| JSON Patch | `json-patch/src/lib.rs` — `Patch`, `PatchOperation`, `diff()`, `patch()` |
| TS client | `client/lib/patch-db.ts` — `PatchDB<T>` class |
| TS patch lib | `client/lib/json-patch-lib.ts` — client-side patch application |
| TS types | `client/lib/types.ts` — `Revision`, `Dump`, `Update`, `PatchOp` |

## Operating rules

- **Wire format** — Rust and TS define `Revision`, `Dump`, and patch operations independently. Changes to one side must be mirrored in the other. See the cross-layer section in [CONTRIBUTING.md](CONTRIBUTING.md#making-changes).
- **Patch operations** — Only `add`, `remove`, and `replace` are used. The TS client does not implement `test`, `move`, or `copy`.
- **Immutable patch application** — The TS client applies patches by shallow-copying objects/arrays, not mutating in place. This is intentional for UI framework change detection.
- **`HasModel` derive** — Respects serde attributes (`rename_all`, `rename`, `flatten`, `tag`, `content`). Generated accessors follow the pattern `as_<field>()`, `as_<field>_mut()`, `into_<field>()`.
- **Error handling** — Rust uses `thiserror` with the `Error` enum in `core/src/lib.rs`. TS does not have formal error types.
87
CONTRIBUTING.md
Normal file
@@ -0,0 +1,87 @@

# Contributing

## Prerequisites

- **Rust** — stable toolchain (edition 2018+)
- **Node.js** — v16+ with npm

## Building

### Rust

```bash
cargo build                    # Build all crates
cargo build --features debug   # Build with tracing support
```

### TypeScript client

```bash
cd client
npm install
npm run build   # Compiles to dist/
npm run check   # Type-check without emitting
```

## Testing

### Rust

```bash
cargo test                 # Run all tests
cargo test -p patch-db     # Core crate only
cargo test -p json-ptr     # JSON Pointer crate only
cargo test -p json-patch   # JSON Patch crate only
```

The core crate uses `proptest` for property-based testing.

### TypeScript

The client uses pre-commit hooks (husky) for linting:

```bash
cd client
npx prettier --check "**/*.{js,ts,html,md,less,json}"
npx tslint --project .
```

## CLI utility

`patch-db-util` provides commands for inspecting and restoring database files:

```bash
# Dump database state as JSON
cargo run -p patch-db-util -- dump path/to/my.db

# Restore database from JSON on stdin
echo '{"count": 42}' | cargo run -p patch-db-util -- from-dump path/to/my.db
```

## Code style

### Rust

- Follow standard `rustfmt` conventions
- Use `thiserror` for error types
- Async functions use `tokio`

### TypeScript

- Prettier for formatting (runs via pre-commit hook)
- TSLint for linting (runs via pre-commit hook)
- RxJS conventions: suffix observables with `$`

## Making changes

1. **Check [ARCHITECTURE.md](ARCHITECTURE.md)** to understand which crate(s) your change touches
2. **Follow existing patterns** — look at neighboring code before inventing new abstractions
3. **Cross-layer changes** (Rust types that affect the TS client) require updating both sides to keep the wire format compatible:
   - `Revision` and `Dump` types must match between `core/src/patch.rs` and `client/lib/types.ts`
   - Patch operations (add/remove/replace) must match between `json-patch/` and `client/lib/json-patch-lib.ts`
4. **Run tests** before submitting

## Commit conventions

- Use imperative mood in commit messages ("add feature", not "added feature")
- Keep commits focused — one logical change per commit
@@ -1,10 +1,9 @@
 [workspace]
 members = [
-    "patch-db",
-    "patch-db-macro",
-    "patch-db-macro-internals",
-    "patch-db-util",
-    "cbor",
+    "core",
+    "macro",
+    "macro-internals",
+    "util",
     "json-patch",
     "json-ptr",
 ]
81
README.md
Normal file
@@ -0,0 +1,81 @@

# patch-db

A database that tracks state updates as [RFC 6902 JSON Patches](https://tools.ietf.org/html/rfc6902). Enables observable, event-driven state management with a Rust backend and TypeScript client.

## Overview

patch-db stores your application state as a single JSON document. Instead of opaque writes, every mutation is recorded as a JSON Patch — a sequence of add/remove/replace operations. Subscribers receive only the patches relevant to the subtree they're watching, making it efficient for UIs that need to react to fine-grained state changes.

### Key properties

- **Event-sourced** — patches are the source of truth, not snapshots
- **Observable** — subscribers watch arbitrary subtrees via JSON Pointers and receive scoped patches in real time
- **Persistent** — the Rust backend writes to disk with CBOR serialization, automatic compaction, and crash-safe backup files
- **Type-safe** — derive macros on the Rust side; generic type parameters and deep `watch$()` overloads on the TypeScript side
- **Immutable values** — the Rust side uses `imbl_value::Value` for efficient structural sharing

## Quick start

### Rust

Add to your `Cargo.toml`:

```toml
[dependencies]
patch-db = { git = "https://github.com/Start9Labs/patch-db" }
```

```rust
use patch_db::PatchDb;
use json_ptr::ROOT;

#[tokio::main]
async fn main() -> Result<(), patch_db::Error> {
    let db = PatchDb::open("my.db").await?;

    // Write a value
    db.put(&ROOT, &serde_json::json!({ "count": 0 })).await?;

    // Read it back
    let dump = db.dump(&ROOT).await;
    println!("revision {}: {}", dump.id, dump.value);

    // Subscribe to changes
    let mut watch = db.watch(ROOT.to_owned()).await;
    // watch implements Stream — use it with tokio, futures, etc.

    Ok(())
}
```

### TypeScript

```typescript
import { PatchDB, Dump, Update } from 'patch-db'
import { Observable } from 'rxjs'

interface AppState {
  users: { [id: string]: { name: string; online: boolean } }
  settings: { theme: string }
}

// source$ delivers updates from the server (WebSocket, SSE, etc.)
const source$: Observable<Update<AppState>[]> = getUpdatesFromServer()

const db = new PatchDB<AppState>(source$)
db.start()

// Watch a deeply nested path — fully type-safe
db.watch$('settings', 'theme').subscribe(theme => {
  console.log('Theme changed:', theme)
})
```

## Further reading

- [ARCHITECTURE.md](ARCHITECTURE.md) — project structure, crate/package details, data flow, storage format
- [CONTRIBUTING.md](CONTRIBUTING.md) — environment setup, build commands, testing, code style

## License

MIT
1
cbor
Submodule cbor deleted from 1debea3d05
@@ -1,28 +1,75 @@

import { Dump, PatchOp } from './types'

/**
 * Common fields shared by all patch operations.
 */
export interface BaseOperation {
  /** RFC 6901 JSON Pointer targeting the value to operate on. */
  path: string
}

/**
 * An RFC 6902 "add" operation. Inserts {@link value} at {@link path}.
 *
 * @typeParam T - The type of the value being added.
 */
export interface AddOperation<T> extends BaseOperation {
  op: PatchOp.ADD
  value: T
}

/**
 * An RFC 6902 "remove" operation. Deletes the value at {@link path}.
 */
export interface RemoveOperation extends BaseOperation {
  op: PatchOp.REMOVE
}

/**
 * An RFC 6902 "replace" operation. Replaces the value at {@link path} with {@link value}.
 *
 * @typeParam T - The type of the replacement value.
 */
export interface ReplaceOperation<T> extends BaseOperation {
  op: PatchOp.REPLACE
  value: T
}

/**
 * A single RFC 6902 patch operation (add, remove, or replace).
 *
 * @typeParam T - The type of values carried by add/replace operations.
 */
export type Operation<T> =
  | AddOperation<T>
  | RemoveOperation
  | ReplaceOperation<T>

/**
 * Sentinel value used internally to distinguish a "remove" result from a
 * legitimate `undefined` value in add/replace operations.
 */
// @claude fix #5: Introduced REMOVE_SENTINEL to fix nested array removes.
// Previously, recursiveApply returned `undefined` for remove ops, which was
// indistinguishable from a legitimate undefined value. For nested paths like
// `/arr/0/nested/2`, the array element was set to `undefined` instead of being
// spliced out. Now callers check for REMOVE_SENTINEL to trigger proper splice.
const REMOVE_SENTINEL = Symbol('remove')

/**
 * Retrieves the value at the given JSON Pointer path within a document.
 *
 * @param data - The document to navigate.
 * @param path - An RFC 6901 JSON Pointer string (e.g. `"/users/0/name"`).
 * @returns The value at `path`, or `undefined` if the path doesn't exist.
 *
 * @example
 * ```ts
 * const doc = { users: [{ name: 'Alice' }] }
 * getValueByPointer(doc, '/users/0/name') // 'Alice'
 * getValueByPointer(doc, '/missing')      // undefined
 * ```
 */
export function getValueByPointer<T extends Record<string, T>>(
  data: T,
  path: string,
@@ -30,12 +77,29 @@ export function getValueByPointer<T extends Record<string, T>>(
  if (!path) return data

  try {
    return arrayFromPath(path).reduce((acc, next) => {
      if (acc == null) return undefined
      return acc[next]
    }, data as any)
  } catch (e) {
    // @claude fix #21: Previously caught all exceptions with `catch (e) { return
    // undefined }`, masking programming errors and state corruption. Now only
    // catches TypeError (from accessing properties on null/undefined), re-throws
    // everything else.
    if (e instanceof TypeError) return undefined
    throw e
  }
}
|
||||
|
||||
/**
|
||||
* Applies a single RFC 6902 operation to a document, mutating it in place.
|
||||
*
|
||||
* Objects and arrays along the path are shallow-copied (spread/splice) so that
|
||||
* reference identity changes propagate correctly for UI framework change detection.
|
||||
*
|
||||
* @param doc - The document to modify. The `value` field is replaced with the updated state.
|
||||
* @param op - The operation to apply. Must include `path` and `op`; `value` is required for add/replace.
|
||||
*/
|
||||
export function applyOperation<T>(
|
||||
doc: Dump<Record<string, any>>,
|
||||
{ path, op, value }: Operation<T> & { value?: T },
|
||||
@@ -43,16 +107,52 @@ export function applyOperation<T>(
   doc.value = recursiveApply(doc.value, arrayFromPath(path), op, value)
 }

 /**
  * Converts an RFC 6901 JSON Pointer string into an array of unescaped path segments.
  *
  * Handles the RFC 6901 escape sequences: `~1` → `/`, `~0` → `~`.
  *
  * @param path - A JSON Pointer string (e.g. `"/foo/bar~1baz"`).
  * @returns An array of unescaped segments (e.g. `["foo", "bar/baz"]`).
  *
  * @example
  * ```ts
  * arrayFromPath('/users/0/name') // ['users', '0', 'name']
  * arrayFromPath('/a~1b/c~0d') // ['a/b', 'c~d']
  * ```
  */
 // @claude fix #19: Pre-compiled regex at module scope. Previously, `new RegExp`
 // objects were constructed on every call to arrayFromPath/pathFromArray — a hot
 // path during patch application. Using regex literals avoids per-call allocation.
 const UNESCAPE_TILDE1 = /~1/g
 const UNESCAPE_TILDE0 = /~0/g
 const ESCAPE_TILDE = /~/g
 const ESCAPE_SLASH = /\//g

 export function arrayFromPath(path: string): string[] {
   return path
     .split('/')
     .slice(1)
     .map(p =>
-      // order matters, always replace "~1" first
-      p.replace(new RegExp('~1', 'g'), '/').replace(new RegExp('~0', 'g'), '~'),
+      p.replace(UNESCAPE_TILDE1, '/').replace(UNESCAPE_TILDE0, '~'),
     )
 }

 /**
  * Converts an array of path segments into an RFC 6901 JSON Pointer string.
  *
  * Handles the RFC 6901 escape sequences: `~` → `~0`, `/` → `~1`.
  *
  * @param args - Path segments (strings or numbers).
  * @returns A JSON Pointer string, or `""` (root) if `args` is empty.
  *
  * @example
  * ```ts
  * pathFromArray(['users', 0, 'name']) // '/users/0/name'
  * pathFromArray([]) // ''
  * ```
  */
 export function pathFromArray(args: Array<string | number>): string {
   if (!args.length) return ''
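The two escape rules above are order-sensitive in both directions. A minimal round-trip sketch (helper names `escapeSeg`/`unescapeSeg` are illustrative, not part of the client API, but follow the same RFC 6901 rules as arrayFromPath/pathFromArray):

```typescript
// When escaping, '~' must be replaced before '/' (otherwise the '~1' produced
// for '/' would itself be re-escaped). When unescaping, '~1' must be replaced
// before '~0' for the symmetric reason.
const escapeSeg = (s: string) => s.replace(/~/g, '~0').replace(/\//g, '~1')
const unescapeSeg = (s: string) => s.replace(/~1/g, '/').replace(/~0/g, '~')

const segs = ['a/b', 'c~d']
const pointer = '/' + segs.map(escapeSeg).join('/')
console.log(pointer) // /a~1b/c~0d
console.log(pointer.split('/').slice(1).map(unescapeSeg)) // [ 'a/b', 'c~d' ]
```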
@@ -62,20 +162,43 @@ export function pathFromArray(args: Array<string | number>): string {
     .map(a =>
       String(a)
-        // do not change order, "~" needs to be replaced first
-        .replace(new RegExp('~', 'g'), '~0')
-        .replace(new RegExp('/', 'g'), '~1'),
+        .replace(ESCAPE_TILDE, '~0')
+        .replace(ESCAPE_SLASH, '~1'),
     )
     .join('/')
   )
 }

 /**
  * Resolves an RFC 6902 array index from a path segment string.
  * Handles the special "-" token (end-of-array for add operations).
  */
 // @claude fix #6: Previously, `parseInt("-")` returned NaN, and
 // `splice(NaN, 0, value)` silently inserted at position 0 instead of
 // appending. Non-numeric segments also produced corrupt state without error.
 // Now explicitly handles "-" per RFC 6902 and validates numeric indices.
 function resolveArrayIndex(segment: string, arrayLength: number): number {
   if (segment === '-') return arrayLength
   const index = Number(segment)
   if (!Number.isInteger(index) || index < 0) {
     throw new Error(`Invalid array index "${segment}" in JSON Patch path`)
   }
   return index
 }

 function recursiveApply<T extends Record<string, any> | any[]>(
   data: T,
   path: readonly string[],
   op: PatchOp,
   value?: any,
 ): T {
-  if (!path.length) return value
+  // Base case: path fully consumed
+  if (!path.length) {
+    // For remove operations, return a sentinel so callers can distinguish
+    // "remove this key" from "set this key to undefined".
+    if (op === PatchOp.REMOVE) return REMOVE_SENTINEL as any
+    return value
+  }

   // object
   if (isObject(data)) {
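The "-" index rule fixed by @claude fix #6 can be seen in isolation. A hedged sketch (the function name mirrors resolveArrayIndex's contract; it is not the library's exact code):

```typescript
// Per RFC 6902, an "add" at "/arr/-" appends to the end of the array.
// parseInt('-') is NaN, and Array.prototype.splice(NaN, 0, v) inserts at
// index 0 — which is why the old parseInt-based code misbehaved.
function resolveIndex(segment: string, len: number): number {
  if (segment === '-') return len // end-of-array token
  const i = Number(segment)
  if (!Number.isInteger(i) || i < 0) throw new Error(`bad index "${segment}"`)
  return i
}

const arr = [10, 20]
arr.splice(resolveIndex('-', arr.length), 0, 30) // append, not prepend
console.log(arr) // [ 10, 20, 30 ]
```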
@@ -84,7 +207,12 @@ function recursiveApply<T extends Record<string, any> | any[]>(
   } else if (Array.isArray(data)) {
     return recursiveApplyArray(data, path, op, value)
   } else {
-    throw 'unreachable'
+    // @claude fix #22: Previously `throw 'unreachable'` — a string with no
+    // stack trace. Now throws a proper Error with a descriptive message.
+    throw new Error(
+      `Cannot apply patch at path segment "${path[0]}": ` +
+        `expected object or array but found ${typeof data}`,
+    )
   }
 }

@@ -100,7 +228,7 @@ function recursiveApplyObject<T extends Record<string, any>>(
     [path[0]]: updated,
   }

-  if (updated === undefined) {
+  if (updated === REMOVE_SENTINEL) {
     delete result[path[0]]
   }
@@ -113,13 +241,25 @@ function recursiveApplyArray<T extends any[]>(
   op: PatchOp,
   value?: any,
 ): T {
-  const index = parseInt(path[0])
-
   const result = [...data] as T
-  // add/remove is only handled differently if this is the last segment in the path
-  if (path.length === 1 && op === PatchOp.ADD) result.splice(index, 0, value)
-  else if (path.length === 1 && op === PatchOp.REMOVE) result.splice(index, 1)
-  else result[index] = recursiveApply(data[index], path.slice(1), op, value)
-
+  if (path.length === 1 && op === PatchOp.ADD) {
+    // RFC 6902: add with "-" appends to the end
+    const index = resolveArrayIndex(path[0], data.length)
+    result.splice(index, 0, value)
+  } else if (path.length === 1 && op === PatchOp.REMOVE) {
+    const index = resolveArrayIndex(path[0], data.length)
+    result.splice(index, 1)
+  } else {
+    const index = resolveArrayIndex(path[0], data.length)
+    const updated = recursiveApply(data[index], path.slice(1), op, value)
+    if (updated === REMOVE_SENTINEL) {
+      // Nested remove targeting an array element — splice it out
+      result.splice(index, 1)
+    } else {
+      result[index] = updated
+    }
+  }

   return result
 }
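The REMOVE_SENTINEL technique described in the @claude fix #5 comment can be demonstrated standalone. A minimal sketch (not the library's exact recursiveApply; `applyRemove` is a hypothetical helper that mirrors the same idea):

```typescript
// A unique symbol is returned when the path is exhausted on a remove, so
// parents can tell "splice this element out" apart from "store undefined here".
const REMOVE = Symbol('remove')

function applyRemove(data: any, path: string[]): any {
  if (path.length === 0) return REMOVE // base case: signal removal to the parent
  const key = path[0]
  const child = applyRemove(data[key], path.slice(1))
  if (Array.isArray(data)) {
    const copy = [...data]
    if (child === REMOVE) copy.splice(Number(key), 1) // splice, don't assign undefined
    else copy[Number(key)] = child
    return copy
  }
  const copy = { ...data }
  if (child === REMOVE) delete copy[key]
  else copy[key] = child
  return copy
}

// Nested remove inside an array: the element is spliced out, not set to undefined.
const doc = { arr: [{ nested: [1, 2, 3] }] }
console.log(JSON.stringify(applyRemove(doc, ['arr', '0', 'nested', '1'])))
// {"arr":[{"nested":[1,3]}]}
```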
@@ -6,7 +6,6 @@ import {
   Subscription,
   switchMap,
   take,
-  withLatestFrom,
 } from 'rxjs'
 import {
   applyOperation,
@@ -15,6 +14,29 @@ import {
   pathFromArray,
 } from './json-patch-lib'

 /**
  * Observable database client backed by RFC 6902 JSON Patches.
  *
  * Consumes a stream of {@link Update}s (either full {@link Dump}s or incremental
  * {@link Revision}s) from a server, maintains a local cache, and exposes reactive
  * `watch$()` observables for any subtree of the document.
  *
  * @typeParam T - The shape of the root document.
  *
  * @example
  * ```ts
  * interface AppState {
  *   users: { [id: string]: { name: string } }
  *   settings: { theme: string }
  * }
  *
  * const db = new PatchDB<AppState>(source$)
  * db.start()
  *
  * // Type-safe deep watching (up to 6 levels)
  * db.watch$('settings', 'theme').subscribe(theme => console.log(theme))
  * ```
  */
 export class PatchDB<T extends { [key: string]: any }> {
   private sub: Subscription | null = null
   private watchedNodes: {
@@ -24,6 +46,10 @@ export class PatchDB<T extends { [key: string]: any }> {
     }
   } = {}

   /**
    * @param source$ - Observable delivering batches of updates from the server.
    * @param cache$ - Optional initial cache. Defaults to an empty document at revision 0.
    */
   constructor(
     private readonly source$: Observable<Update<T>[]>,
     private readonly cache$ = new BehaviorSubject<Dump<T>>({
@@ -32,16 +58,27 @@ export class PatchDB<T extends { [key: string]: any }> {
     }),
   ) {}

   /**
    * Begin listening to the source observable and applying updates.
    * Calling `start()` when already started is a no-op.
    */
   start() {
     if (this.sub) return

-    this.sub = this.source$
-      .pipe(withLatestFrom(this.cache$))
-      .subscribe(([updates, cache]) => {
-        this.proccessUpdates(updates, cache)
+    // @claude fix #14: Previously used `source$.pipe(withLatestFrom(cache$))`.
+    // Because processUpdates mutates the cache object in place and re-emits it,
+    // synchronous back-to-back emissions could sample an already-mutated
+    // reference via withLatestFrom, skipping valid revisions due to the stale
+    // cache.id check. Reading `this.cache$.value` directly avoids the issue.
+    this.sub = this.source$.subscribe(updates => {
+      this.processUpdates(updates, this.cache$.value)
     })
   }

   /**
    * Stop listening, complete all watched node subjects, and reset the cache.
    * Calling `stop()` when already stopped is a no-op.
    */
   stop() {
     if (!this.sub) return
@@ -52,21 +89,40 @@ export class PatchDB<T extends { [key: string]: any }> {
     this.cache$.next({ id: 0, value: {} as T })
   }

   /**
    * Returns an observable of the value at the given path within the document.
    *
    * Overloaded for 0–6 path segments with full type safety. The returned
    * observable emits whenever a patch touches the watched path (or any
    * ancestor/descendant of it).
    *
    * The observable waits for the first non-zero revision (i.e. a real dump)
    * before emitting, so subscribers won't see the empty initial state.
    *
    * @example
    * ```ts
    * // Watch the entire document
    * db.watch$().subscribe(state => ...)
    *
    * // Watch a nested path
    * db.watch$('users', 'abc123', 'name').subscribe(name => ...)
    * ```
    */
   // @claude fix #13: Removed outer NonNullable wrapper from the 1-level
   // overload return type. Runtime values can be null/undefined (e.g. after a
   // remove operation), so the previous NonNullable<T[P1]> was unsound — callers
   // skipped null checks based on the type, leading to runtime crashes.
   watch$(): Observable<T>
-  watch$<P1 extends keyof T>(p1: P1): Observable<NonNullable<T[P1]>>
+  watch$<P1 extends keyof T>(p1: P1): Observable<T[P1]>
   watch$<P1 extends keyof T, P2 extends keyof NonNullable<T[P1]>>(
     p1: P1,
     p2: P2,
-  ): Observable<NonNullable<NonNullable<T[P1]>[P2]>>
+  ): Observable<NonNullable<T[P1]>[P2]>
   watch$<
     P1 extends keyof T,
     P2 extends keyof NonNullable<T[P1]>,
     P3 extends keyof NonNullable<NonNullable<T[P1]>[P2]>,
-  >(
-    p1: P1,
-    p2: P2,
-    p3: P3,
-  ): Observable<NonNullable<NonNullable<NonNullable<T[P1]>[P2]>[P3]>>
+  >(p1: P1, p2: P2, p3: P3): Observable<NonNullable<NonNullable<T[P1]>[P2]>[P3]>
   watch$<
     P1 extends keyof T,
     P2 extends keyof NonNullable<T[P1]>,
@@ -77,9 +133,7 @@ export class PatchDB<T extends { [key: string]: any }> {
     p2: P2,
     p3: P3,
     p4: P4,
-  ): Observable<
-    NonNullable<NonNullable<NonNullable<NonNullable<T[P1]>[P2]>[P3]>[P4]>
-  >
+  ): Observable<NonNullable<NonNullable<NonNullable<T[P1]>[P2]>[P3]>[P4]>
   watch$<
     P1 extends keyof T,
     P2 extends keyof NonNullable<T[P1]>,
@@ -95,10 +149,8 @@ export class PatchDB<T extends { [key: string]: any }> {
     p4: P4,
     p5: P5,
   ): Observable<
     NonNullable<
       NonNullable<NonNullable<NonNullable<NonNullable<T[P1]>[P2]>[P3]>[P4]>[P5]
     >
   >
   watch$<
     P1 extends keyof T,
     P2 extends keyof NonNullable<T[P1]>,
@@ -119,13 +171,9 @@ export class PatchDB<T extends { [key: string]: any }> {
     p6: P6,
   ): Observable<
     NonNullable<
-      NonNullable<
-        NonNullable<
-          NonNullable<NonNullable<NonNullable<T[P1]>[P2]>[P3]>[P4]
-        >[P5]
+      NonNullable<NonNullable<NonNullable<NonNullable<T[P1]>[P2]>[P3]>[P4]>[P5]
       >[P6]
     >
-  >
   watch$(...args: (string | number)[]): Observable<any> {
     return this.cache$.pipe(
       filter(({ id }) => !!id),
@@ -143,11 +191,33 @@ export class PatchDB<T extends { [key: string]: any }> {
     )
   }

-  proccessUpdates(updates: Update<T>[], cache: Dump<T>) {
+  /**
+   * Processes a batch of updates (dumps and/or revisions) against the cache.
+   *
+   * Revisions with an id below the expected next revision are skipped (deduplication).
+   * Revisions that skip ahead (gap detected) are applied with a warning, since the
+   * state may be inconsistent until the next full dump.
+   *
+   * After all updates are applied, the cache subject emits the new state.
+   *
+   * @param updates - The batch of updates to process.
+   * @param cache - The current cache (mutated in place).
+   */
+  processUpdates(updates: Update<T>[], cache: Dump<T>) {
     updates.forEach(update => {
       if (this.isRevision(update)) {
+        const expected = cache.id + 1
+        if (update.id < expected) return
+        // @claude fix #7: Previously, revision gaps were silently applied. If
+        // revision 4 was missing and 5 arrived (cache at 3), the patch was
+        // applied without revision 4's changes, producing corrupt state with
+        // no indication. Now logs a warning so the issue is visible.
+        if (update.id > expected) {
+          console.warn(
+            `[patch-db] Revision gap detected: expected ${expected}, got ${update.id}. ` +
+              `State may be inconsistent until the next full dump.`,
+          )
+        }
         this.handleRevision(update, cache)
       } else {
         this.handleDump(update, cache)
@@ -157,17 +227,28 @@ export class PatchDB<T extends { [key: string]: any }> {
     this.cache$.next(cache)
   }

+  /** @deprecated Use {@link processUpdates} instead. */
+  proccessUpdates(updates: Update<T>[], cache: Dump<T>) {
+    this.processUpdates(updates, cache)
+  }
+
   private handleRevision(revision: Revision, cache: Dump<T>): void {
-    // apply opperations
+    // apply operations
     revision.patch.forEach(op => {
       applyOperation(cache, op)
     })
+    // @claude fix #20: Previously, arrayFromPath(op.path) was called for every
+    // (watchedNode, patchOp) pair — O(watchedNodes × patchOps) redundant parsing.
+    // Pre-converting once outside the loop makes it O(patchOps + watchedNodes).
+    const patchArrs = revision.patch.map(op => ({
+      path: op.path,
+      arr: arrayFromPath(op.path),
+    }))
     // update watched nodes
     Object.entries(this.watchedNodes).forEach(([watchedPath, { pathArr }]) => {
-      const match = revision.patch.find(({ path }) => {
-        const arr = arrayFromPath(path)
-        return startsWith(pathArr, arr) || startsWith(arr, pathArr)
-      })
+      const match = patchArrs.find(
+        ({ arr }) => startsWith(pathArr, arr) || startsWith(arr, pathArr),
+      )
       if (match) this.updateWatchedNode(watchedPath, cache.value)
     })
   }
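The revision bookkeeping that @claude fix #7 adds can be captured as a small pure function. A hedged sketch (`classifyRevision` is a hypothetical helper, not part of the client API; it mirrors the skip/apply/warn decision described in the comment):

```typescript
// Given the cached revision id and an incoming revision id, decide whether to
// skip it (duplicate/stale), apply it in order, or apply it with a gap warning.
type Decision = 'skip' | 'apply' | 'apply-with-gap-warning'

function classifyRevision(cacheId: number, revisionId: number): Decision {
  const expected = cacheId + 1
  if (revisionId < expected) return 'skip' // already seen this revision
  if (revisionId > expected) return 'apply-with-gap-warning' // missed revision(s)
  return 'apply'
}

console.log(classifyRevision(3, 3)) // skip
console.log(classifyRevision(3, 4)) // apply
console.log(classifyRevision(3, 6)) // apply-with-gap-warning
```

The gap case still applies the patch — state may be wrong until the next full dump, but a warning makes the inconsistency visible instead of silent.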
@@ -1,14 +1,35 @@
 import { Operation } from './json-patch-lib'

 /**
  * An incremental state change. Contains the revision number and the
  * RFC 6902 patch operations needed to transition from the previous state.
  */
 export type Revision = {
   /** Monotonically increasing revision number. */
   id: number
   /** The patch operations that produce this revision from the previous one. */
   patch: Operation<unknown>[]
 }

 /**
  * A complete snapshot of the database state at a given revision.
  *
  * @typeParam T - The shape of the stored document.
  */
 export type Dump<T> = { id: number; value: T }

 /**
  * A server message: either a full {@link Dump} (snapshot) or an incremental {@link Revision} (patch).
  *
  * @typeParam T - The shape of the stored document.
  */
 export type Update<T> = Revision | Dump<T>

 /**
  * The three JSON Patch operation types produced by patch-db.
  *
  * Only `add`, `remove`, and `replace` are used — `test`, `move`, and `copy` are not produced by the server.
  */
 export enum PatchOp {
   ADD = 'add',
   REMOVE = 'remove',
patch-db/.gitignore → core/.gitignore (vendored)
@@ -28,9 +28,9 @@ lazy_static = "1.4.0"
 tracing = { version = "0.1.29", optional = true }
 tracing-error = { version = "0.2.0", optional = true }
 nix = "0.30.1"
-patch-db-macro = { path = "../patch-db-macro" }
+patch-db-macro = { path = "../macro" }
 serde = { version = "1", features = ["rc"] }
-serde_cbor = { path = "../cbor" }
+ciborium = "0.2"
 thiserror = "2"
 tokio = { version = "1", features = ["sync", "fs", "rt", "io-util", "macros"] }
core/proptest-regressions/test.txt (new file)
@@ -0,0 +1,7 @@
+# Seeds for failure cases proptest has generated in the past. It is
+# automatically read and these particular cases re-run before any
+# novel cases are generated.
+#
+# It is recommended to check this file in to source control so that
+# everyone who runs the test benefits from these saved cases.
+cc a239369714309ab23267c160f9243dca5b57da78a5abe0455a26df99f8a3300b # shrinks to s = "$💡;"
@@ -27,8 +27,10 @@ pub enum Error {
     IO(#[from] IOError),
     #[error("JSON (De)Serialization Error: {0}")]
     JSON(#[from] imbl_value::Error),
-    #[error("CBOR (De)Serialization Error: {0}")]
-    CBOR(#[from] serde_cbor::Error),
+    #[error("CBOR Deserialization Error: {0}")]
+    CborDe(#[from] ciborium::de::Error<IOError>),
+    #[error("CBOR Serialization Error: {0}")]
+    CborSer(#[from] ciborium::ser::Error<IOError>),
     #[error("Index Error: {0:?}")]
     Pointer(#[from] json_ptr::IndexError),
     #[error("Patch Error: {0}")]
@@ -151,7 +151,10 @@ impl DiffPatch {
                     .get_segment(arr_path_idx)
                     .and_then(|seg| seg.parse::<usize>().ok())
                 {
-                    if idx >= onto_idx {
+                    // @claude fix #4: Was `idx >= onto_idx`, which caused
+                    // `idx - 1` to underflow when both were 0 (panic in
+                    // debug, wraps to usize::MAX in release).
+                    if idx > onto_idx {
                         let mut new_path = prefix.clone().to_owned();
                         new_path.push_end_idx(idx - 1);
                         if let Some(tail) = path.slice(arr_path_idx + 1..) {
@@ -71,13 +71,14 @@ impl Store {
             fd_lock_rs::LockType::Exclusive,
             false,
         )?;
-        let mut stream =
-            serde_cbor::StreamDeserializer::new(serde_cbor::de::IoRead::new(&mut *f));
-        let mut revision: u64 = stream.next().transpose()?.unwrap_or(0);
-        let mut stream = stream.change_output_type();
-        let mut persistent = stream.next().transpose()?.unwrap_or_else(|| Value::Null);
-        let mut stream = stream.change_output_type();
-        while let Some(Ok(patch)) = stream.next() {
+        let mut reader = std::io::BufReader::new(&mut *f);
+        let mut revision: u64 =
+            ciborium::from_reader(&mut reader).unwrap_or(0);
+        let mut persistent: Value =
+            ciborium::from_reader(&mut reader).unwrap_or(Value::Null);
+        while let Ok(patch) =
+            ciborium::from_reader::<json_patch::Patch, _>(&mut reader)
+        {
             if let Err(_) = json_patch::patch(&mut persistent, &patch) {
                 #[cfg(feature = "tracing")]
                 tracing::error!("Error applying patch, skipping...");
@@ -105,7 +106,7 @@ impl Store {
             })
         })
         .await??;
-        res.compress().await?;
+        res.compress().await.map(|_| ())?;
         Ok(res)
     }
     pub async fn close(mut self) -> Result<(), Error> {
@@ -114,10 +115,19 @@ impl Store {
         self.file.flush().await?;
         self.file.shutdown().await?;
         self.file.unlock(true).map_err(|e| e.1)?;
+
+        // @claude fix #15: OPEN_STORES never removed entries, causing unbounded
+        // growth over the lifetime of a process. Now cleaned up on close().
+        let mut lock = OPEN_STORES.lock().await;
+        lock.remove(&self.path);
+
         Ok(())
     }
+    // @claude fix #18: Previously compared against Value::Null, which conflated
+    // an explicit JSON null with a missing key. Now uses .is_some() so that a
+    // key with null value is correctly reported as existing.
     pub(crate) fn exists<S: AsRef<str>, V: SegList>(&self, ptr: &JsonPointer<S, V>) -> bool {
-        ptr.get(&self.persistent).unwrap_or(&Value::Null) != &Value::Null
+        ptr.get(&self.persistent).is_some()
     }
     pub(crate) fn keys<S: AsRef<str>, V: SegList>(
         &self,
@@ -165,27 +175,57 @@ impl Store {
     ) -> Result<Option<Arc<Revision>>, Error> {
         self.put_value(ptr, &imbl_value::to_value(&value)?).await
     }
-    pub(crate) async fn compress(&mut self) -> Result<(), Error> {
+    /// Compresses the database file by writing a fresh snapshot.
+    ///
+    /// Returns `true` if the backup was committed (point of no return — the new
+    /// state will be recovered on restart regardless of main file state).
+    /// Returns `false` if the backup was never committed (safe to undo in memory).
+    ///
+    // @claude fix #2 + #10: Rewrote compress with three explicit phases:
+    // 1. Atomic backup via tmp+rename (safe to undo before this point)
+    // 2. Main file rewrite (backup ensures crash recovery; undo is unsafe)
+    // 3. Backup removal is non-fatal (#10) — a leftover backup is harmlessly
+    // replayed on restart. Previously, remove_file failure propagated an error
+    // that caused Store::open to rename the stale backup over the good file.
+    // Return type changed from Result<(), Error> to Result<bool, Error> so the
+    // caller (TentativeUpdated in apply()) knows whether undo is safe (#2).
+    pub(crate) async fn compress(&mut self) -> Result<bool, Error> {
         use tokio::io::AsyncWriteExt;
         let bak = self.path.with_extension("bak");
+        let bak_tmp = bak.with_extension("bak.tmp");
-        let revision_cbor = serde_cbor::to_vec(&self.revision)?;
-        let data_cbor = serde_cbor::to_vec(&self.persistent)?;
+        let mut revision_cbor = Vec::new();
+        ciborium::into_writer(&self.revision, &mut revision_cbor)?;
+        let mut data_cbor = Vec::new();
+        ciborium::into_writer(&self.persistent, &mut data_cbor)?;
+
+        // Phase 1: Create atomic backup. If this fails, the main file is
+        // untouched and the caller can safely undo the in-memory patch.
+        let mut backup_file = File::create(&bak_tmp).await?;
         backup_file.write_all(&revision_cbor).await?;
         backup_file.write_all(&data_cbor).await?;
         backup_file.flush().await?;
         backup_file.sync_all().await?;
+        tokio::fs::rename(&bak_tmp, &bak).await?;
+
+        // Point of no return: the backup exists with the new state. On restart,
+        // Store::open will rename it over the main file. From here, errors
+        // must NOT cause an in-memory undo.
+
+        // Phase 2: Rewrite main file. If this fails, the backup ensures crash
+        // recovery. We propagate the error but signal that undo is unsafe.
         self.file.set_len(0).await?;
         self.file.seek(SeekFrom::Start(0)).await?;
         self.file.write_all(&revision_cbor).await?;
         self.file.write_all(&data_cbor).await?;
         self.file.flush().await?;
         self.file.sync_all().await?;
-        tokio::fs::remove_file(&bak).await?;
         self.file_cursor = self.file.stream_position().await?;
-        Ok(())
+
+        // Phase 3: Remove backup. Non-fatal — on restart, the backup (which
+        // matches the main file) will be harmlessly applied.
+        let _ = tokio::fs::remove_file(&bak).await;
+
+        Ok(true)
     }
     pub(crate) async fn apply(&mut self, patch: DiffPatch) -> Result<Option<Arc<Revision>>, Error> {
         use tokio::io::AsyncWriteExt;
@@ -222,11 +262,28 @@ impl Store {
         tracing::trace!("Attempting to apply patch: {:?}", patch);

         // apply patch in memory
-        let patch_bin = serde_cbor::to_vec(&*patch)?;
+        let mut patch_bin = Vec::new();
+        ciborium::into_writer(&*patch, &mut patch_bin)?;
         let mut updated = TentativeUpdated::new(self, &patch)?;

         if updated.store.revision % 4096 == 0 {
-            updated.store.compress().await?
+            match updated.store.compress().await {
+                Ok(_) => {
+                    // Compress succeeded; disarm undo (done below).
+                }
+                Err(e) => {
+                    // @claude fix #2: If compress() succeeded past the atomic
+                    // backup rename, the new state will be recovered on restart.
+                    // Rolling back in-memory would permanently desync memory vs
+                    // disk. Check for backup existence to decide whether undo
+                    // is safe.
+                    let bak = updated.store.path.with_extension("bak");
+                    if bak.exists() {
+                        updated.undo.take(); // disarm: can't undo past the backup
+                    }
+                    return Err(e);
+                }
+            }
         } else {
             if updated.store.file.stream_position().await? != updated.store.file_cursor {
                 updated
@@ -315,6 +372,17 @@ impl PatchDb {
             store: Arc::new(RwLock::new(Store::open(path).await?)),
         })
     }
+    pub async fn close(self) -> Result<(), Error> {
+        let store = Arc::try_unwrap(self.store)
+            .map_err(|_| {
+                Error::IO(std::io::Error::new(
+                    std::io::ErrorKind::WouldBlock,
+                    "other PatchDb references still exist",
+                ))
+            })?
+            .into_inner();
+        store.close().await
+    }
     pub async fn dump<S: AsRef<str>, V: SegList>(&self, ptr: &JsonPointer<S, V>) -> Dump {
         self.store.read().await.dump(ptr)
     }
@@ -386,6 +454,11 @@ impl PatchDb {
             .await
             .into()
     }
+    // @claude fix #1: Previously, `old` was read once before the loop and never
+    // refreshed. If another writer modified store.persistent between the initial
+    // read and the write-lock acquisition, the `old == store.persistent` check
+    // failed forever — spinning the loop infinitely. Now `old` is re-read from
+    // the store at the start of each iteration.
     pub async fn run_idempotent<F, Fut, T, E>(&self, f: F) -> Result<(Value, T), E>
     where
         F: Fn(Value) -> Fut + Send + Sync + UnwindSafe,
@@ -393,10 +466,11 @@ impl PatchDb {
         Fut: std::future::Future<Output = Result<(Value, T), E>> + UnwindSafe,
         E: From<Error>,
     {
+        loop {
             let store = self.store.read().await;
             let old = store.persistent.clone();
             drop(store);
-        loop {

             let (new, res) = async { f(old.clone()).await }
                 .catch_unwind()
                 .await
@@ -408,11 +482,12 @@ impl PatchDb {
                 )
             })??;
             let mut store = self.store.write().await;
-            if &old == &store.persistent {
+            if old == store.persistent {
                 let diff = diff(&store.persistent, &new);
                 store.apply(diff).await?;
                 return Ok((new, res));
             }
+            // State changed since we read it; retry with the fresh value
         }
     }
 }
@@ -93,6 +93,10 @@ impl DbWatch {
         self.seen = true;
         Ok(self.state.clone())
     }
+    // @claude fix #9: Previously applied only one revision per poll, emitting
+    // intermediate states that may never have been a consistent committed state.
+    // Now drains all queued revisions after the first wake, matching sync()
+    // behavior so the caller always sees a fully caught-up snapshot.
     pub fn poll_changed(&mut self, cx: &mut std::task::Context<'_>) -> Poll<Result<(), Error>> {
         if !self.seen {
             self.seen = true;
@@ -101,6 +105,9 @@ impl DbWatch {
         let rev =
             ready!(self.subscriber.poll_recv(cx)).ok_or(mpsc::error::TryRecvError::Disconnected)?;
         patch(&mut self.state, &rev.patch.0)?;
+        while let Ok(rev) = self.subscriber.try_recv() {
+            patch(&mut self.state, &rev.patch.0)?;
+        }
         Poll::Ready(Ok(()))
     }
     pub async fn changed(&mut self) -> Result<(), Error> {
@@ -1,4 +1,5 @@
 use std::future::Future;
+use std::sync::atomic::{AtomicUsize, Ordering};
 use std::sync::Arc;

 use imbl_value::{json, Value};
@@ -10,6 +11,14 @@ use tokio::runtime::Builder;

 use crate::{self as patch_db};

+/// Atomic counter to generate unique file paths across concurrent tests.
+static TEST_COUNTER: AtomicUsize = AtomicUsize::new(0);
+
+fn unique_db_path(prefix: &str) -> String {
+    let id = TEST_COUNTER.fetch_add(1, Ordering::Relaxed);
+    format!("test-{}-{}.db", prefix, id)
+}
+
 async fn init_db(db_name: String) -> PatchDb {
     cleanup_db(&db_name).await;
     let db = PatchDb::open(db_name).await.unwrap();
@@ -31,9 +40,12 @@ async fn init_db(db_name: String) -> PatchDb {

 async fn cleanup_db(db_name: &str) {
     fs::remove_file(db_name).await.ok();
+    fs::remove_file(format!("{}.bak", db_name)).await.ok();
+    fs::remove_file(format!("{}.bak.tmp", db_name)).await.ok();
+    fs::remove_file(format!("{}.failed", db_name)).await.ok();
 }

-async fn put_string_into_root(db: PatchDb, s: String) -> Arc<Revision> {
+async fn put_string_into_root(db: &PatchDb, s: String) -> Arc<Revision> {
     db.put(&JsonPointer::<&'static str>::default(), &s)
         .await
         .unwrap()
@@ -42,14 +54,16 @@ async fn put_string_into_root(db: PatchDb, s: String) -> Arc<Revision> {

 #[tokio::test]
 async fn basic() {
-    let db = init_db("test.db".to_string()).await;
+    let path = unique_db_path("basic");
+    let db = init_db(path.clone()).await;
     let ptr: JsonPointer = "/b/b".parse().unwrap();
     let mut get_res: Value = db.get(&ptr).await.unwrap();
     assert_eq!(get_res.as_u64(), Some(1));
     db.put(&ptr, "hello").await.unwrap();
     get_res = db.get(&ptr).await.unwrap();
     assert_eq!(get_res.as_str(), Some("hello"));
-    cleanup_db("test.db").await;
+    db.close().await.unwrap();
+    cleanup_db(&path).await;
 }

 fn run_future<S: Into<String>, Fut: Future<Output = ()>>(name: S, fut: Fut) {
@@ -64,9 +78,11 @@ proptest! {
     #[test]
     fn doesnt_crash(s in "\\PC*") {
         run_future("test-doesnt-crash", async {
-            let db = init_db("test.db".to_string()).await;
-            put_string_into_root(db, s).await;
-            cleanup_db(&"test.db".to_string()).await;
+            let path = unique_db_path("proptest");
+            let db = init_db(path.clone()).await;
+            put_string_into_root(&db, s).await;
+            db.close().await.unwrap();
+            cleanup_db(&path).await;
         });
     }
 }
Submodule json-patch deleted from ba38c78e4d

4  json-patch/.gitignore  (vendored)  Normal file
@@ -0,0 +1,4 @@
/.idea/
/target/
**/*.rs.bk
Cargo.lock
27  json-patch/Cargo.toml  Normal file
@@ -0,0 +1,27 @@
[package]
name = "json-patch"
version = "0.2.7-alpha.0"
authors = ["Ivan Dubrov <dubrov.ivan@gmail.com>"]
categories = []
keywords = ["json", "json-patch"]
description = "RFC 6902, JavaScript Object Notation (JSON) Patch"
repository = "https://github.com/idubrov/json-patch"
license = "MIT/Apache-2.0"
readme = "README.md"
edition = "2018"

[features]
default = ["diff"]
nightly = []
diff = []

[dependencies]
imbl-value = "0.4.1"
json-ptr = { path = "../json-ptr" }
serde = { version = "1", features = ["derive"] }

[dev-dependencies]
rand = "0.9.1"
serde_json = { version = "1.0.60", features = ["preserve_order"] }
proptest = "1"
imbl-value = { version = "0.4.1", features = ["arbitrary"] }
92  json-patch/specs/merge_tests.json  Normal file
@@ -0,0 +1,92 @@
[
  {
    "comment": "1. introduction",
    "doc": { "a": "b", "c": { "d": "e", "f": "g" } },
    "patch": { "a": "z", "c": { "f": null } },
    "expected": { "a": "z", "c": { "d": "e" } },
    "merge": true
  },
  {
    "comment": "3. example",
    "doc": {
      "title": "Goodbye!",
      "author": { "givenName": "John", "familyName": "Doe" },
      "tags": ["example", "sample"],
      "content": "This will be unchanged"
    },
    "patch": {
      "title": "Hello!",
      "phoneNumber": "+01-123-456-7890",
      "author": { "familyName": null },
      "tags": ["example"]
    },
    "expected": {
      "title": "Hello!",
      "author": { "givenName": "John" },
      "tags": ["example"],
      "content": "This will be unchanged",
      "phoneNumber": "+01-123-456-7890"
    },
    "merge": true
  },
  {
    "comment": "replacing non-object",
    "doc": {
      "title": "Goodbye!",
      "author": { "givenName": "John" },
      "tags": ["example", "sample"],
      "content": "This will be unchanged"
    },
    "patch": { "tags": { "kind": "example" } },
    "expected": {
      "title": "Goodbye!",
      "author": { "givenName": "John" },
      "tags": { "kind": "example" },
      "content": "This will be unchanged"
    },
    "merge": true
  }
]
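These fixtures exercise RFC 7396 merge-patch semantics: object patches merge recursively, `null` in the patch deletes a key, and any non-object patch value replaces the target wholesale. The rules can be sketched stdlib-only over a toy value type (this is an illustration of the fixtures, not the crate's real `merge`, which operates on `imbl_value::Value`):

```rust
use std::collections::BTreeMap;

#[derive(Clone, Debug, PartialEq)]
enum Val {
    Null,
    Str(String),
    Obj(BTreeMap<String, Val>),
}

// Minimal RFC 7396 merge-patch sketch over a toy value type.
fn merge(doc: &mut Val, patch: &Val) {
    if let Val::Obj(p) = patch {
        // An object patch merges; a non-object target is first reset to {}.
        if !matches!(doc, Val::Obj(_)) {
            *doc = Val::Obj(BTreeMap::new());
        }
        if let Val::Obj(d) = doc {
            for (k, v) in p {
                if matches!(v, Val::Null) {
                    d.remove(k); // null in the patch deletes the key
                } else {
                    merge(d.entry(k.clone()).or_insert(Val::Null), v);
                }
            }
        }
    } else {
        *doc = patch.clone(); // non-object patches replace wholesale
    }
}
```

Running this against the first fixture (`{"a":"b","c":{"d":"e","f":"g"}}` merged with `{"a":"z","c":{"f":null}}`) yields `{"a":"z","c":{"d":"e"}}`, as the `expected` field records.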
286  json-patch/specs/revert_tests.json  Normal file
@@ -0,0 +1,286 @@
[
  {
    "comment": "Can revert add (replace key)",
    "doc": { "foo": { "bar": { "baz": true } } },
    "patch": [
      { "op": "add", "path": "/foo", "value": false },
      { "op": "remove", "path": "/foo/bar" }
    ],
    "error": "invalid pointer"
  },
  {
    "comment": "Can revert add (insert into array)",
    "doc": { "foo": [1, 2, 3] },
    "patch": [
      { "op": "add", "path": "/foo/1", "value": false },
      { "op": "remove", "path": "/foo/bar" }
    ],
    "error": "invalid pointer"
  },
  {
    "comment": "Can revert add (insert last element into array)",
    "doc": { "foo": [1, 2, 3] },
    "patch": [
      { "op": "add", "path": "/foo/-", "value": false },
      { "op": "remove", "path": "/foo/bar" }
    ],
    "error": "invalid pointer"
  },
  {
    "comment": "Can revert remove (object)",
    "doc": { "foo": { "bar": { "baz": true } } },
    "patch": [
      { "op": "remove", "path": "/foo" },
      { "op": "remove", "path": "/foo/bar" }
    ],
    "error": "invalid pointer"
  },
  {
    "comment": "Can revert remove (array)",
    "doc": { "foo": [1, 2, 3] },
    "patch": [
      { "op": "remove", "path": "/foo/1" },
      { "op": "remove", "path": "/foo/bar" }
    ],
    "error": "invalid pointer"
  },
  {
    "comment": "Can revert replace (replace key)",
    "doc": { "foo": { "bar": { "baz": true } } },
    "patch": [
      { "op": "replace", "path": "/foo", "value": false },
      { "op": "remove", "path": "/foo/bar" }
    ],
    "error": "invalid pointer"
  },
  {
    "comment": "Can revert replace (replace array element)",
    "doc": { "foo": [1, 2, 3] },
    "patch": [
      { "op": "replace", "path": "/foo/1", "value": false },
      { "op": "remove", "path": "/foo/bar" }
    ],
    "error": "invalid pointer"
  },
  {
    "comment": "Can revert move (move into key)",
    "doc": {
      "foo": { "bar": { "baz": true } },
      "abc": { "def": { "ghi": false } }
    },
    "patch": [
      { "op": "move", "from": "/abc", "path": "/foo", "value": false },
      { "op": "remove", "path": "/foo/bar" }
    ],
    "error": "invalid pointer"
  },
  {
    "comment": "Can revert move (move into array)",
    "doc": { "foo": [1, 2, 3], "abc": { "def": { "ghi": false } } },
    "patch": [
      { "op": "move", "path": "/foo/1", "from": "/abc" },
      { "op": "remove", "path": "/foo/bar" }
    ],
    "error": "invalid pointer"
  },
  {
    "comment": "Can revert move (move into last element of an array)",
    "doc": { "foo": [1, 2, 3], "abc": { "def": { "ghi": false } } },
    "patch": [
      { "op": "move", "path": "/foo/-", "from": "/abc" },
      { "op": "remove", "path": "/foo/bar" }
    ],
    "error": "invalid pointer"
  },
  {
    "comment": "Can revert copy (copy into key)",
    "doc": {
      "foo": { "bar": { "baz": true } },
      "abc": { "def": { "ghi": false } }
    },
    "patch": [
      { "op": "copy", "from": "/abc", "path": "/foo", "value": false },
      { "op": "remove", "path": "/foo/bar" }
    ],
    "error": "invalid pointer"
  },
  {
    "comment": "Can revert copy (copy into array)",
    "doc": { "foo": [1, 2, 3], "abc": { "def": { "ghi": false } } },
    "patch": [
      { "op": "copy", "path": "/foo/1", "from": "/abc" },
      { "op": "remove", "path": "/foo/bar" }
    ],
    "error": "invalid pointer"
  },
  {
    "comment": "Can revert copy (copy into last element of an array)",
    "doc": { "foo": [1, 2, 3], "abc": { "def": { "ghi": false } } },
    "patch": [
      { "op": "copy", "path": "/foo/-", "from": "/abc" },
      { "op": "remove", "path": "/foo/bar" }
    ],
    "error": "invalid pointer"
  }
]
343  json-patch/specs/spec_tests.json  Normal file
@@ -0,0 +1,343 @@
[
  {
    "comment": "4.1. add with missing object",
    "doc": { "q": { "bar": 2 } },
    "patch": [{ "op": "add", "path": "/a/b", "value": 1 }],
    "error": "path /a does not exist -- missing objects are not created recursively"
  },
  {
    "comment": "A.1. Adding an Object Member",
    "doc": { "foo": "bar" },
    "patch": [{ "op": "add", "path": "/baz", "value": "qux" }],
    "expected": { "baz": "qux", "foo": "bar" }
  },
  {
    "comment": "A.2. Adding an Array Element",
    "doc": { "foo": ["bar", "baz"] },
    "patch": [{ "op": "add", "path": "/foo/1", "value": "qux" }],
    "expected": { "foo": ["bar", "qux", "baz"] }
  },
  {
    "comment": "A.3. Removing an Object Member",
    "doc": { "baz": "qux", "foo": "bar" },
    "patch": [{ "op": "remove", "path": "/baz" }],
    "expected": { "foo": "bar" }
  },
  {
    "comment": "A.4. Removing an Array Element",
    "doc": { "foo": ["bar", "qux", "baz"] },
    "patch": [{ "op": "remove", "path": "/foo/1" }],
    "expected": { "foo": ["bar", "baz"] }
  },
  {
    "comment": "A.5. Replacing a Value",
    "doc": { "baz": "qux", "foo": "bar" },
    "patch": [{ "op": "replace", "path": "/baz", "value": "boo" }],
    "expected": { "baz": "boo", "foo": "bar" }
  },
  {
    "comment": "A.6. Moving a Value",
    "doc": {
      "foo": { "bar": "baz", "waldo": "fred" },
      "qux": { "corge": "grault" }
    },
    "patch": [{ "op": "move", "from": "/foo/waldo", "path": "/qux/thud" }],
    "expected": {
      "foo": { "bar": "baz" },
      "qux": { "corge": "grault", "thud": "fred" }
    }
  },
  {
    "comment": "A.7. Moving an Array Element",
    "doc": { "foo": ["all", "grass", "cows", "eat"] },
    "patch": [{ "op": "move", "from": "/foo/1", "path": "/foo/3" }],
    "expected": { "foo": ["all", "cows", "eat", "grass"] }
  },
  {
    "comment": "A.8. Testing a Value: Success",
    "doc": { "baz": "qux", "foo": ["a", 2, "c"] },
    "patch": [
      { "op": "test", "path": "/baz", "value": "qux" },
      { "op": "test", "path": "/foo/1", "value": 2 }
    ],
    "expected": { "baz": "qux", "foo": ["a", 2, "c"] }
  },
  {
    "comment": "A.9. Testing a Value: Error",
    "doc": { "baz": "qux" },
    "patch": [{ "op": "test", "path": "/baz", "value": "bar" }],
    "error": "string not equivalent"
  },
  {
    "comment": "A.10. Adding a nested Member Object",
    "doc": { "foo": "bar" },
    "patch": [{ "op": "add", "path": "/child", "value": { "grandchild": {} } }],
    "expected": { "foo": "bar", "child": { "grandchild": {} } }
  },
  {
    "comment": "A.11. Ignoring Unrecognized Elements",
    "doc": { "foo": "bar" },
    "patch": [{ "op": "add", "path": "/baz", "value": "qux", "xyz": 123 }],
    "expected": { "foo": "bar", "baz": "qux" }
  },
  {
    "comment": "A.12. Adding to a Non-existent Target",
    "doc": { "foo": "bar" },
    "patch": [{ "op": "add", "path": "/baz/bat", "value": "qux" }],
    "error": "add to a non-existent target"
  },
  {
    "comment": "A.13 Invalid JSON Patch Document",
    "doc": { "foo": "bar" },
    "patch": [{ "op": "add", "path": "/baz", "value": "qux", "op": "remove" }],
    "error": "operation has two 'op' members",
    "disabled": true
  },
  {
    "comment": "A.14. ~ Escape Ordering",
    "doc": { "/": 9, "~1": 10 },
    "patch": [{ "op": "test", "path": "/~01", "value": 10 }],
    "expected": { "/": 9, "~1": 10 }
  },
  {
    "comment": "A.15. Comparing Strings and Numbers",
    "doc": { "/": 9, "~1": 10 },
    "patch": [{ "op": "test", "path": "/~01", "value": "10" }],
    "error": "number is not equal to string"
  },
  {
    "comment": "A.16. Adding an Array Value",
    "doc": { "foo": ["bar"] },
    "patch": [{ "op": "add", "path": "/foo/-", "value": ["abc", "def"] }],
    "expected": { "foo": ["bar", ["abc", "def"]] }
  }
]
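Test A.14 above hinges on RFC 6901's escape ordering: when decoding a reference token, `~1` must be rewritten to `/` before `~0` is rewritten to `~`, so that `/~01` addresses the literal key `~1` rather than `/`. A one-function sketch:

```rust
// RFC 6901 reference-token decoding: '~1' -> '/' first, then '~0' -> '~'.
// The reverse order would wrongly decode "~01" to "/":
//   "~01" --(~0 -> ~)--> "~1" --(~1 -> /)--> "/"
fn unescape(token: &str) -> String {
    token.replace("~1", "/").replace("~0", "~")
}
```

This is why the fixture's `doc` deliberately contains both a `"/"` key and a `"~1"` key: a pointer implementation that applies the replacements in the wrong order resolves `/~01` to the wrong member.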
1877  json-patch/specs/tests.json  Normal file
File diff suppressed because it is too large
293  json-patch/src/diff.rs  Normal file
@@ -0,0 +1,293 @@
use std::collections::BTreeSet;

use imbl_value::Value;
use json_ptr::JsonPointer;

use crate::{AddOperation, PatchOperation, RemoveOperation, ReplaceOperation};

struct PatchDiffer {
    path: JsonPointer,
    patch: super::Patch,
}

impl PatchDiffer {
    fn new() -> Self {
        Self {
            path: JsonPointer::default(),
            patch: super::Patch(Vec::new()),
        }
    }
}

/// Diff two JSON documents and generate a JSON Patch (RFC 6902).
///
/// # Example
/// Diff two JSONs:
///
/// ```rust
/// #[macro_use]
/// extern crate imbl_value;
/// extern crate json_patch;
///
/// use json_patch::{patch, diff, from_value};
///
/// # pub fn main() {
/// let left = json!({
///     "title": "Goodbye!",
///     "author" : {
///         "givenName" : "John",
///         "familyName" : "Doe"
///     },
///     "tags":[ "example", "sample" ],
///     "content": "This will be unchanged"
/// });
///
/// let right = json!({
///     "title": "Hello!",
///     "author" : {
///         "givenName" : "John"
///     },
///     "tags": [ "example" ],
///     "content": "This will be unchanged",
///     "phoneNumber": "+01-123-456-7890"
/// });
///
/// let p = diff(&left, &right);
/// assert_eq!(p, from_value(json!([
///     { "op": "remove", "path": "/author/familyName" },
///     { "op": "add", "path": "/phoneNumber", "value": "+01-123-456-7890" },
///     { "op": "remove", "path": "/tags/1" },
///     { "op": "replace", "path": "/title", "value": "Hello!" },
/// ])).unwrap());
///
/// let mut doc = left.clone();
/// patch(&mut doc, &p).unwrap();
/// assert_eq!(doc, right);
///
/// # }
/// ```
pub fn diff(from: &Value, to: &Value) -> super::Patch {
    let mut differ = PatchDiffer::new();
    diff_mut(&mut differ, from, to);
    differ.patch
}

fn diff_mut(differ: &mut PatchDiffer, from: &Value, to: &Value) {
    match (from, to) {
        (Value::Object(f), Value::Object(t)) if !f.ptr_eq(t) => {
            for key in f
                .keys()
                .chain(t.keys())
                .map(|k| &**k)
                .collect::<BTreeSet<_>>()
            {
                differ.path.push_end(key);
                match (f.get(key), to.get(key)) {
                    (Some(f), Some(t)) if f != t => {
                        diff_mut(differ, f, t);
                    }
                    (Some(_), None) => {
                        differ.patch.0.push(PatchOperation::Remove(RemoveOperation {
                            path: differ.path.clone(),
                        }));
                    }
                    (None, Some(t)) => {
                        differ.patch.0.push(PatchOperation::Add(AddOperation {
                            path: differ.path.clone(),
                            value: t.clone(),
                        }));
                    }
                    _ => (),
                }
                differ.path.pop_end();
            }
        }
        (Value::Array(f), Value::Array(t)) if !f.ptr_eq(t) => {
            if f.len() < t.len() {
                let mut f_idx = 0;
                let mut t_idx = 0;
                while t_idx < t.len() {
                    if f_idx == f.len() {
                        differ.patch.0.push(PatchOperation::Add(AddOperation {
                            path: differ.path.clone().join_end_idx(t_idx),
                            value: t[t_idx].clone(),
                        }));
                        t_idx += 1;
                    } else {
                        if !f[f_idx].ptr_eq(&t[t_idx]) {
                            if t.iter().skip(t_idx + 1).any(|t| f[f_idx].ptr_eq(t)) {
                                differ.patch.0.push(PatchOperation::Add(AddOperation {
                                    path: differ.path.clone().join_end_idx(t_idx),
                                    value: t[t_idx].clone(),
                                }));
                                t_idx += 1;
                                continue;
                            } else {
                                differ.path.push_end_idx(t_idx);
                                diff_mut(differ, &f[f_idx], &t[t_idx]);
                                differ.path.pop_end();
                            }
                        }
                        f_idx += 1;
                        t_idx += 1;
                    }
                }
                while f_idx < f.len() {
                    differ.patch.0.push(PatchOperation::Remove(RemoveOperation {
                        path: differ.path.clone().join_end_idx(t_idx),
                    }));
                    f_idx += 1;
                }
            } else if f.len() > t.len() {
                let mut f_idx = 0;
                let mut t_idx = 0;
                while f_idx < f.len() {
                    if t_idx == t.len() {
                        differ.patch.0.push(PatchOperation::Remove(RemoveOperation {
                            path: differ.path.clone().join_end_idx(t_idx),
                        }));
                        f_idx += 1;
                    } else {
                        if !f[f_idx].ptr_eq(&t[t_idx]) {
                            if f.iter().skip(f_idx + 1).any(|f| t[t_idx].ptr_eq(f)) {
                                differ.patch.0.push(PatchOperation::Remove(RemoveOperation {
                                    path: differ.path.clone().join_end_idx(t_idx),
                                }));
                                f_idx += 1;
                                continue;
                            } else {
                                differ.path.push_end_idx(t_idx);
                                diff_mut(differ, &f[f_idx], &t[t_idx]);
                                differ.path.pop_end();
                            }
                        }
                        f_idx += 1;
                        t_idx += 1;
                    }
                }
                while t_idx < t.len() {
                    differ.patch.0.push(PatchOperation::Add(AddOperation {
                        path: differ.path.clone().join_end_idx(t_idx),
                        value: t[t_idx].clone(),
                    }));
                    t_idx += 1;
                }
            } else {
                for i in 0..f.len() {
                    if !f[i].ptr_eq(&t[i]) {
                        differ.path.push_end_idx(i);
                        diff_mut(differ, &f[i], &t[i]);
                        differ.path.pop_end();
                    }
                }
            }
        }
        (f, t) if f != t => differ
            .patch
            .0
            .push(PatchOperation::Replace(ReplaceOperation {
                path: differ.path.clone(),
                value: t.clone(),
            })),
        _ => (),
    }
}

#[cfg(test)]
mod tests {
    use imbl_value::Value;

    #[test]
    pub fn replace_all() {
        let left = json!({"title": "Hello!"});
        let p = super::diff(&left, &Value::Null);
        assert_eq!(
            p,
            imbl_value::from_value(json!([
                { "op": "replace", "path": "", "value": null },
            ]))
            .unwrap()
        );
    }

    #[test]
    pub fn add_all() {
        let right = json!({"title": "Hello!"});
        let p = super::diff(&Value::Null, &right);
        assert_eq!(
            p,
            imbl_value::from_value(json!([
                { "op": "replace", "path": "", "value": { "title": "Hello!" } },
            ]))
            .unwrap()
        );
    }

    #[test]
    pub fn remove_all() {
        let left = json!(["hello", "bye"]);
        let right = json!([]);
        let p = super::diff(&left, &right);
        assert_eq!(
            p,
            imbl_value::from_value(json!([
                { "op": "remove", "path": "/0" },
                { "op": "remove", "path": "/0" },
            ]))
            .unwrap()
        );
    }

    #[test]
    pub fn remove_tail() {
        let left = json!(["hello", "bye", "hi"]);
        let right = json!(["hello"]);
        let p = super::diff(&left, &right);
        assert_eq!(
            p,
            imbl_value::from_value(json!([
                { "op": "remove", "path": "/1" },
                { "op": "remove", "path": "/1" },
            ]))
            .unwrap()
        );
    }

    #[test]
    pub fn replace_object() {
        let left = json!(["hello", "bye"]);
        let right = json!({"hello": "bye"});
        let p = super::diff(&left, &right);
        assert_eq!(
            p,
            imbl_value::from_value(json!([
                { "op": "replace", "path": "", "value": &right },
            ]))
            .unwrap()
        );
    }

    #[test]
    fn escape_json_keys() {
        let mut left = json!({
            "/slashed/path": 1
        });
        let right = json!({
            "/slashed/path": 2,
        });
        let patch = super::diff(&left, &right);

        eprintln!("{:?}", patch);

        crate::patch(&mut left, &patch).unwrap();
        assert_eq!(left, right);
    }

    proptest::proptest! {
        #[test]
        fn test_diff(mut from: Value, to: Value) {
            let patch = super::diff(&from, &to);
            crate::patch(&mut from, &patch).unwrap();
            assert_eq!(from, to);
        }
    }
}
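The object arm of `diff_mut` above walks the union of keys from both documents and emits a `remove`, `add`, or `replace` operation depending on which side holds the key. The same shape, reduced to a stdlib-only sketch over flat string-to-int maps (ops rendered as strings for brevity; `diff_flat` is a hypothetical helper, not the crate's API):

```rust
use std::collections::{BTreeMap, BTreeSet};

// Toy illustration of the object branch of `diff_mut`: walk the union
// of keys in deterministic order and emit one op per differing key.
fn diff_flat(from: &BTreeMap<&str, i32>, to: &BTreeMap<&str, i32>) -> Vec<String> {
    let mut ops = Vec::new();
    let keys: BTreeSet<&&str> = from.keys().chain(to.keys()).collect();
    for k in keys {
        match (from.get(*k), to.get(*k)) {
            // Present on both sides with different values: replace.
            (Some(f), Some(t)) if f != t => ops.push(format!("replace /{k} {t}")),
            // Only in `from`: remove.
            (Some(_), None) => ops.push(format!("remove /{k}")),
            // Only in `to`: add.
            (None, Some(t)) => ops.push(format!("add /{k} {t}")),
            _ => {}
        }
    }
    ops
}
```

The real implementation additionally recurses into nested values (via `diff_mut`) and uses `ptr_eq` on the persistent `imbl` structures to skip unchanged subtrees cheaply.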
603  json-patch/src/lib.rs  Normal file
@@ -0,0 +1,603 @@
//! A [JSON Patch (RFC 6902)](https://tools.ietf.org/html/rfc6902) and
//! [JSON Merge Patch (RFC 7396)](https://tools.ietf.org/html/rfc7396) implementation for Rust.
//!
//! # Usage
//!
//! Add this to your *Cargo.toml*:
//! ```toml
//! [dependencies]
//! json-patch = "*"
//! ```
//!
//! # Examples
//! Create and patch document using JSON Patch:
//!
//! ```rust
//! #[macro_use]
//! extern crate imbl_value;
//! extern crate json_patch;
//!
//! use json_patch::patch;
//! use serde_json::from_str;
//!
//! # pub fn main() {
//! let mut doc = json!([
//!     { "name": "Andrew" },
//!     { "name": "Maxim" }
//! ]);
//!
//! let p = from_str(r#"[
//!     { "op": "test", "path": "/0/name", "value": "Andrew" },
//!     { "op": "add", "path": "/0/happy", "value": true }
//! ]"#).unwrap();
//!
//! patch(&mut doc, &p).unwrap();
//! assert_eq!(doc, json!([
//!     { "name": "Andrew", "happy": true },
//!     { "name": "Maxim" }
//! ]));
//!
//! # }
//! ```
//!
//! Create and patch document using JSON Merge Patch:
//!
//! ```rust
//! #[macro_use]
//! extern crate imbl_value;
//! extern crate json_patch;
//!
//! use json_patch::merge;
//!
//! # pub fn main() {
//! let mut doc = json!({
//!     "title": "Goodbye!",
//!     "author" : {
//!         "givenName" : "John",
//!         "familyName" : "Doe"
//!     },
//!     "tags":[ "example", "sample" ],
//!     "content": "This will be unchanged"
//! });
//!
//! let patch = json!({
//!     "title": "Hello!",
//!     "phoneNumber": "+01-123-456-7890",
//!     "author": {
//!         "familyName": null
//!     },
//!     "tags": [ "example" ]
//! });
//!
//! merge(&mut doc, &patch);
//! assert_eq!(doc, json!({
//!     "title": "Hello!",
//!     "author" : {
//!         "givenName" : "John"
//!     },
//!     "tags": [ "example" ],
//!     "content": "This will be unchanged",
//!     "phoneNumber": "+01-123-456-7890"
//! }));
//! # }
//! ```
#![deny(warnings)]
#![warn(missing_docs)]
#[cfg_attr(test, macro_use)]
extern crate imbl_value;

use imbl_value::{InOMap as Map, Value};
use json_ptr::{JsonPointer, SegList};
use serde::{Deserialize, Serialize};
use std::error::Error;
use std::{fmt, mem};

/// Representation of JSON Patch (list of patch operations)
#[derive(Debug, Serialize, Deserialize, Clone, PartialEq)]
pub struct Patch(pub Vec<PatchOperation>);
impl Patch {
    /// Prepend a path to a patch.
    /// This is useful if you run a diff on a JSON document that is a small member of a larger document
    pub fn prepend<S: AsRef<str>, V: SegList>(&mut self, ptr: &JsonPointer<S, V>) {
        for op in self.0.iter_mut() {
            match op {
                PatchOperation::Add(ref mut op) => {
                    op.path.prepend(ptr);
                }
                PatchOperation::Remove(ref mut op) => {
                    op.path.prepend(ptr);
                }
                PatchOperation::Replace(ref mut op) => {
                    op.path.prepend(ptr);
                }
                PatchOperation::Move(ref mut op) => {
                    op.path.prepend(ptr);
                    op.from.prepend(ptr);
                }
                PatchOperation::Copy(ref mut op) => {
                    op.path.prepend(ptr);
                    op.from.prepend(ptr);
                }
                PatchOperation::Test(ref mut op) => {
                    op.path.prepend(ptr);
                }
            }
        }
    }
    /// Checks whether or not the data at a path could be affected by a patch
    pub fn affects_path<S: AsRef<str>, V: SegList>(&self, ptr: &JsonPointer<S, V>) -> bool {
        for op in self.0.iter() {
            match op {
                PatchOperation::Add(ref op) => {
                    if op.path.starts_with(ptr) || ptr.starts_with(&op.path) {
                        return true;
                    }
                }
                PatchOperation::Remove(ref op) => {
                    if op.path.starts_with(ptr) || ptr.starts_with(&op.path) {
                        return true;
                    }
                }
                PatchOperation::Replace(ref op) => {
                    if op.path.starts_with(ptr) || ptr.starts_with(&op.path) {
                        return true;
                    }
                }
                PatchOperation::Move(ref op) => {
                    if op.path.starts_with(ptr)
                        || ptr.starts_with(&op.path)
                        || op.from.starts_with(ptr)
                        || ptr.starts_with(&op.from)
                    {
                        return true;
                    }
                }
                PatchOperation::Copy(ref op) => {
                    if op.path.starts_with(ptr) || ptr.starts_with(&op.path) {
                        return true;
                    }
                }
                PatchOperation::Test(_) => {}
            }
        }
        false
    }
    /// Returns whether the patch is empty
    pub fn is_empty(&self) -> bool {
        self.0.is_empty()
    }
}

/// JSON Patch 'add' operation representation
#[derive(Debug, Serialize, Deserialize, Clone, PartialEq)]
pub struct AddOperation {
    /// JSON-Pointer value [RFC6901](https://tools.ietf.org/html/rfc6901) that references a location
    /// within the target document where the operation is performed.
    pub path: JsonPointer<String>,
    /// Value to add to the target location.
    pub value: Value,
}

/// JSON Patch 'remove' operation representation
#[derive(Debug, Serialize, Deserialize, Clone, PartialEq)]
pub struct RemoveOperation {
    /// JSON-Pointer value [RFC6901](https://tools.ietf.org/html/rfc6901) that references a location
    /// within the target document where the operation is performed.
    pub path: JsonPointer<String>,
}

/// JSON Patch 'replace' operation representation
#[derive(Debug, Serialize, Deserialize, Clone, PartialEq)]
pub struct ReplaceOperation {
    /// JSON-Pointer value [RFC6901](https://tools.ietf.org/html/rfc6901) that references a location
    /// within the target document where the operation is performed.
    pub path: JsonPointer<String>,
    /// Value to replace with.
    pub value: Value,
}

/// JSON Patch 'move' operation representation
#[derive(Debug, Serialize, Deserialize, Clone, PartialEq)]
pub struct MoveOperation {
    /// JSON-Pointer value [RFC6901](https://tools.ietf.org/html/rfc6901) that references a location
    /// to move value from.
    pub from: JsonPointer<String>,
    /// JSON-Pointer value [RFC6901](https://tools.ietf.org/html/rfc6901) that references a location
    /// within the target document where the operation is performed.
    pub path: JsonPointer<String>,
}

/// JSON Patch 'copy' operation representation
#[derive(Debug, Serialize, Deserialize, Clone, PartialEq)]
pub struct CopyOperation {
    /// JSON-Pointer value [RFC6901](https://tools.ietf.org/html/rfc6901) that references a location
    /// to copy value from.
    pub from: JsonPointer<String>,
    /// JSON-Pointer value [RFC6901](https://tools.ietf.org/html/rfc6901) that references a location
    /// within the target document where the operation is performed.
    pub path: JsonPointer<String>,
}

/// JSON Patch 'test' operation representation
#[derive(Debug, Serialize, Deserialize, Clone, PartialEq)]
pub struct TestOperation {
    /// JSON-Pointer value [RFC6901](https://tools.ietf.org/html/rfc6901) that references a location
    /// within the target document where the operation is performed.
    pub path: JsonPointer<String>,
    /// Value to test against.
    pub value: Value,
}

/// JSON Patch single patch operation
#[derive(Debug, Serialize, Deserialize, Clone, PartialEq)]
#[serde(tag = "op")]
#[serde(rename_all = "lowercase")]
pub enum PatchOperation {
    /// 'add' operation
    Add(AddOperation),
    /// 'remove' operation
    Remove(RemoveOperation),
    /// 'replace' operation
    Replace(ReplaceOperation),
    /// 'move' operation
    Move(MoveOperation),
    /// 'copy' operation
    Copy(CopyOperation),
    /// 'test' operation
    Test(TestOperation),
}

/// This type represents all possible errors that can occur when applying JSON patch
#[derive(Debug)]
pub enum PatchError {
    /// One of the pointers in the patch is invalid
    InvalidPointer,

    /// 'test' operation failed
    TestFailed,
}

impl Error for PatchError {}

impl fmt::Display for PatchError {
    fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
        match *self {
            PatchError::InvalidPointer => write!(fmt, "invalid pointer"),
            PatchError::TestFailed => write!(fmt, "test failed"),
        }
    }
}

fn add<S: AsRef<str>, V: SegList>(
    doc: &mut Value,
    path: &JsonPointer<S, V>,
    value: Value,
) -> Result<Option<Value>, PatchError> {
    path.insert(doc, value, false)
        .map_err(|_| PatchError::InvalidPointer)
}

fn remove<S: AsRef<str>, V: SegList>(
    doc: &mut Value,
    path: &JsonPointer<S, V>,
    allow_last: bool,
) -> Result<Value, PatchError> {
    path.remove(doc, allow_last)
        .ok_or(PatchError::InvalidPointer)
}

fn replace<S: AsRef<str>, V: SegList>(
    doc: &mut Value,
    path: &JsonPointer<S, V>,
    value: Value,
) -> Result<Value, PatchError> {
    if let Some(target) = path.get_mut(doc) {
        Ok(mem::replace(target, value))
    } else {
        Ok(add(doc, path, value)?.unwrap_or_default())
    }
}

fn mov<S0: AsRef<str>, S1: AsRef<str>>(
    doc: &mut Value,
    from: &JsonPointer<S0>,
    path: &JsonPointer<S1>,
    allow_last: bool,
) -> Result<Option<Value>, PatchError> {
    if path == from {
        return Ok(None);
    }
    // Check we are not moving inside own child
    if path.starts_with(from) || from.is_empty() {
        return Err(PatchError::InvalidPointer);
    }
    let val = remove(doc, from, allow_last)?;
    add(doc, path, val)
}

fn copy<S0: AsRef<str>, S1: AsRef<str>>(
    doc: &mut Value,
    from: &JsonPointer<S0>,
    path: &JsonPointer<S1>,
) -> Result<Option<Value>, PatchError> {
    let source = from.get(doc).ok_or(PatchError::InvalidPointer)?.clone();
    add(doc, path, source)
}

fn test<S: AsRef<str>, V: SegList>(
    doc: &Value,
    path: &JsonPointer<S, V>,
    expected: &Value,
) -> Result<(), PatchError> {
    let target = path.get(doc).ok_or(PatchError::InvalidPointer)?;
    if *target == *expected {
        Ok(())
    } else {
        Err(PatchError::TestFailed)
    }
}

/// Create JSON Patch from JSON Value
/// # Examples
///
/// Create patch from `imbl_value::Value`:
///
/// ```rust
/// #[macro_use]
/// extern crate imbl_value;
/// extern crate json_patch;
///
/// use json_patch::{Patch, from_value};
///
/// # pub fn main() {
/// let patch_value = json!([
|
||||
/// { "op": "test", "path": "/0/name", "value": "Andrew" },
|
||||
/// { "op": "add", "path": "/0/happy", "value": true }
|
||||
/// ]);
|
||||
/// let patch: Patch = from_value(patch_value).unwrap();
|
||||
/// # }
|
||||
/// ```
|
||||
///
|
||||
/// Create patch from string:
|
||||
///
|
||||
/// ```rust
|
||||
/// #[macro_use]
|
||||
/// extern crate serde_json;
|
||||
/// extern crate json_patch;
|
||||
///
|
||||
/// use json_patch::Patch;
|
||||
/// use serde_json::from_str;
|
||||
///
|
||||
/// # pub fn main() {
|
||||
/// let patch_str = r#"[
|
||||
/// { "op": "test", "path": "/0/name", "value": "Andrew" },
|
||||
/// { "op": "add", "path": "/0/happy", "value": true }
|
||||
/// ]"#;
|
||||
/// let patch: Patch = from_str(patch_str).unwrap();
|
||||
/// # }
|
||||
/// ```
|
||||
pub fn from_value(value: Value) -> Result<Patch, imbl_value::Error> {
|
||||
let patch = imbl_value::from_value::<Vec<PatchOperation>>(value)?;
|
||||
Ok(Patch(patch))
|
||||
}
|
||||
|
||||
/// Patch provided JSON document (given as `imbl_value::Value`) in-place. If any of the patch is
|
||||
/// failed, all previous operations are reverted. In case of internal error resulting in panic,
|
||||
/// document might be left in inconsistent state.
|
||||
///
|
||||
/// # Example
|
||||
/// Create and patch document:
|
||||
///
|
||||
/// ```rust
|
||||
/// #[macro_use]
|
||||
/// extern crate imbl_value;
|
||||
/// extern crate json_patch;
|
||||
///
|
||||
/// use json_patch::patch;
|
||||
/// use serde_json::from_str;
|
||||
///
|
||||
/// # pub fn main() {
|
||||
/// let mut doc = json!([
|
||||
/// { "name": "Andrew" },
|
||||
/// { "name": "Maxim" }
|
||||
/// ]);
|
||||
///
|
||||
/// let p = from_str(r#"[
|
||||
/// { "op": "test", "path": "/0/name", "value": "Andrew" },
|
||||
/// { "op": "add", "path": "/0/happy", "value": true }
|
||||
/// ]"#).unwrap();
|
||||
///
|
||||
/// patch(&mut doc, &p).unwrap();
|
||||
/// assert_eq!(doc, json!([
|
||||
/// { "name": "Andrew", "happy": true },
|
||||
/// { "name": "Maxim" }
|
||||
/// ]));
|
||||
///
|
||||
/// # }
|
||||
/// ```
|
||||
pub fn patch<'a>(doc: &mut Value, patch: &'a Patch) -> Result<Undo<'a>, PatchError> {
|
||||
let mut undo = Undo(Vec::with_capacity(patch.0.len()));
|
||||
apply_patches(doc, &patch.0, &mut undo).map(|_| undo)
|
||||
}
|
||||
|
||||
/// Object that can be used to undo a patch if successful
|
||||
pub struct Undo<'a>(Vec<Box<dyn FnOnce(&mut Value) + Send + Sync + 'a>>);
|
||||
impl<'a> Undo<'a> {
|
||||
/// Apply the undo to the document
|
||||
pub fn apply(mut self, doc: &mut Value) {
|
||||
while let Some(undo) = self.0.pop() {
|
||||
undo(doc)
|
||||
}
|
||||
}
|
||||
}
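The `Undo` type above records one boxed closure per applied operation and replays them in reverse (LIFO) order. The same pattern can be sketched in a self-contained way on a plain `i64` "document" instead of a JSON `Value`; all names below (`UndoStack`, `demo`) are illustrative and not part of this crate:

```rust
// Minimal sketch of the LIFO undo-closure pattern used by `Undo`,
// applied to a plain i64 instead of a JSON Value.
struct UndoStack<'a>(Vec<Box<dyn FnOnce(&mut i64) + 'a>>);

impl<'a> UndoStack<'a> {
    fn apply(mut self, doc: &mut i64) {
        // Pop in reverse order so later operations are undone first.
        while let Some(undo) = self.0.pop() {
            undo(doc)
        }
    }
}

fn demo() -> (i64, i64) {
    let mut doc = 1i64;
    let mut undo = UndoStack(Vec::new());

    doc += 10; // operation 1: record how to revert it
    undo.0.push(Box::new(|d| *d -= 10));
    doc *= 2; // operation 2: record how to revert it
    undo.0.push(Box::new(|d| *d /= 2));

    let patched = doc;
    undo.apply(&mut doc); // replays closures in reverse: /2, then -10
    (patched, doc)
}

fn main() {
    let (patched, reverted) = demo();
    assert_eq!(patched, 22);
    assert_eq!(reverted, 1);
    println!("patched = {}, reverted = {}", patched, reverted);
}
```

Replaying in reverse order matters: undoing operation 2 before operation 1 restores the exact intermediate states that the forward pass moved through.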

// Apply patches while tracking all the changes being made so they can be reverted in case
// subsequent patches fail. Uses heap-allocated closures to keep the state.
fn apply_patches<'a>(
    doc: &mut Value,
    patches: &'a [PatchOperation],
    undo: &mut Undo<'a>,
) -> Result<(), PatchError> {
    let (patch, tail) = match patches.split_first() {
        None => return Ok(()),
        Some((patch, tail)) => (patch, tail),
    };

    let res = match *patch {
        PatchOperation::Add(ref op) => {
            let prev = add(doc, &op.path, op.value.clone())?;
            undo.0.push(Box::new(move |doc| {
                match prev {
                    None => remove(doc, &op.path, true).unwrap(),
                    Some(v) => add(doc, &op.path, v).unwrap().unwrap(),
                };
            }));
            apply_patches(doc, tail, undo)
        }
        PatchOperation::Remove(ref op) => {
            let prev = remove(doc, &op.path, false)?;
            undo.0.push(Box::new(move |doc| {
                assert!(add(doc, &op.path, prev).unwrap().is_none());
            }));
            apply_patches(doc, tail, undo)
        }
        PatchOperation::Replace(ref op) => {
            let prev = replace(doc, &op.path, op.value.clone())?;
            undo.0.push(Box::new(move |doc| {
                replace(doc, &op.path, prev).unwrap();
            }));
            apply_patches(doc, tail, undo)
        }
        PatchOperation::Move(ref op) => {
            let prev = mov(doc, &op.from, &op.path, false)?;
            undo.0.push(Box::new(move |doc| {
                mov(doc, &op.path, &op.from, true).unwrap();
                if let Some(prev) = prev {
                    assert!(add(doc, &op.path, prev).unwrap().is_none());
                }
            }));
            apply_patches(doc, tail, undo)
        }
        PatchOperation::Copy(ref op) => {
            let prev = copy(doc, &op.from, &op.path)?;
            undo.0.push(Box::new(move |doc| {
                match prev {
                    None => remove(doc, &op.path, true).unwrap(),
                    Some(v) => add(doc, &op.path, v).unwrap().unwrap(),
                };
            }));
            apply_patches(doc, tail, undo)
        }
        PatchOperation::Test(ref op) => {
            test(doc, &op.path, &op.value)?;
            undo.0.push(Box::new(move |_| ()));
            apply_patches(doc, tail, undo)
        }
    };
    if res.is_err() {
        undo.0.pop().unwrap()(doc);
    }
    res
}

/// Patch the provided JSON document (given as `imbl_value::Value`) in place.
/// Operations are applied in an unsafe manner: if any operation fails, previous
/// operations are not reverted.
pub fn patch_unsafe(doc: &mut Value, patch: &Patch) -> Result<(), PatchError> {
    for op in &patch.0 {
        match *op {
            PatchOperation::Add(ref op) => {
                add(doc, &op.path, op.value.clone())?;
            }
            PatchOperation::Remove(ref op) => {
                remove(doc, &op.path, false)?;
            }
            PatchOperation::Replace(ref op) => {
                replace(doc, &op.path, op.value.clone())?;
            }
            PatchOperation::Move(ref op) => {
                mov(doc, &op.from, &op.path, false)?;
            }
            PatchOperation::Copy(ref op) => {
                copy(doc, &op.from, &op.path)?;
            }
            PatchOperation::Test(ref op) => {
                test(doc, &op.path, &op.value)?;
            }
        };
    }
    Ok(())
}

/// Patch the provided JSON document (given as `imbl_value::Value`) in place with a JSON Merge
/// Patch (RFC 7396).
///
/// # Example
/// Create and patch a document:
///
/// ```rust
/// #[macro_use]
/// extern crate imbl_value;
/// extern crate json_patch;
///
/// use json_patch::merge;
///
/// # pub fn main() {
/// let mut doc = json!({
///     "title": "Goodbye!",
///     "author" : {
///         "givenName" : "John",
///         "familyName" : "Doe"
///     },
///     "tags":[ "example", "sample" ],
///     "content": "This will be unchanged"
/// });
///
/// let patch = json!({
///     "title": "Hello!",
///     "phoneNumber": "+01-123-456-7890",
///     "author": {
///         "familyName": null
///     },
///     "tags": [ "example" ]
/// });
///
/// merge(&mut doc, &patch);
/// assert_eq!(doc, json!({
///     "title": "Hello!",
///     "author" : {
///         "givenName" : "John"
///     },
///     "tags": [ "example" ],
///     "content": "This will be unchanged",
///     "phoneNumber": "+01-123-456-7890"
/// }));
/// # }
/// ```
pub fn merge(doc: &mut Value, patch: &Value) {
    if !patch.is_object() {
        *doc = patch.clone();
        return;
    }

    if !doc.is_object() {
        *doc = Value::Object(Map::new());
    }
    let map = doc.as_object_mut().unwrap();
    for (key, value) in patch.as_object().unwrap() {
        if value.is_null() {
            map.remove(&*key);
        } else {
            merge(map.entry(key.clone()).or_insert(Value::Null), value);
        }
    }
}
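The RFC 7396 semantics implemented by `merge` (non-object patches replace wholesale, `null` deletes a key, objects recurse) can be demonstrated without `imbl_value`. The sketch below re-implements the same recursion on a tiny stand-in JSON type; the `Json` enum and `demo` function are illustrative names, not part of this crate:

```rust
use std::collections::BTreeMap;

// Tiny stand-in for a JSON value, to illustrate RFC 7396 merge semantics.
#[derive(Clone, Debug, PartialEq)]
enum Json {
    Null,
    Str(String),
    Object(BTreeMap<String, Json>),
}

fn merge(doc: &mut Json, patch: &Json) {
    // A non-object patch replaces the target wholesale.
    let Json::Object(patch_map) = patch else {
        *doc = patch.clone();
        return;
    };
    // Merging into a non-object starts from an empty object.
    if !matches!(doc, Json::Object(_)) {
        *doc = Json::Object(BTreeMap::new());
    }
    let Json::Object(map) = doc else { unreachable!() };
    for (key, value) in patch_map {
        if matches!(value, Json::Null) {
            map.remove(key); // null in the patch deletes the key
        } else {
            merge(map.entry(key.clone()).or_insert(Json::Null), value);
        }
    }
}

fn demo() -> Json {
    let mut doc = Json::Object(BTreeMap::from([
        ("title".into(), Json::Str("Goodbye!".into())),
        ("tags".into(), Json::Str("sample".into())),
    ]));
    let patch = Json::Object(BTreeMap::from([
        ("title".into(), Json::Str("Hello!".into())),
        ("tags".into(), Json::Null), // delete "tags"
    ]));
    merge(&mut doc, &patch);
    doc
}

fn main() {
    let doc = demo();
    assert_eq!(
        doc,
        Json::Object(BTreeMap::from([(
            "title".to_string(),
            Json::Str("Hello!".to_string())
        )]))
    );
    println!("{:?}", doc);
}
```

Note the asymmetry with JSON Patch: merge patches cannot set a key to `null` (that spelling means deletion), which is one reason patch-db uses RFC 6902 patches for revisions.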

#[cfg(feature = "diff")]
mod diff;

#[cfg(feature = "diff")]
pub use self::diff::diff;

#[cfg(test)]
mod tests;
83 json-patch/src/tests/mod.rs Normal file
@@ -0,0 +1,83 @@
#![allow(unused)]
extern crate rand;

mod util;

use super::*;
use serde_json::from_str;

#[test]
fn parse_from_value() {
    use PatchOperation::*;

    let json = json!([{"op": "add", "path": "/a/b", "value": 1}, {"op": "remove", "path": "/c"}]);
    let patch: Patch = from_value(json).unwrap();

    assert_eq!(
        patch,
        Patch(vec![
            Add(AddOperation {
                path: "/a/b".parse().unwrap(),
                value: json!(1),
            }),
            Remove(RemoveOperation {
                path: "/c".parse().unwrap(),
            }),
        ])
    );

    let _patch: Patch =
        from_str(r#"[{"op": "add", "path": "/a/b", "value": 1}, {"op": "remove", "path": "/c"}]"#)
            .unwrap();
}

#[test]
fn parse_from_string() {
    use PatchOperation::*;

    let patch: Patch =
        from_str(r#"[{"op": "add", "path": "/a/b", "value": 1}, {"op": "remove", "path": "/c"}]"#)
            .unwrap();

    assert_eq!(
        patch,
        Patch(vec![
            Add(AddOperation {
                path: "/a/b".parse().unwrap(),
                value: json!(1),
            }),
            Remove(RemoveOperation {
                path: "/c".parse().unwrap()
            }),
        ])
    );
}

#[test]
fn serialize_patch() {
    let s = r#"[{"op":"add","path":"/a/b","value":1},{"op":"remove","path":"/c"}]"#;
    let patch: Patch = from_str(s).unwrap();

    let serialized = serde_json::to_string(&patch).unwrap();
    assert_eq!(serialized, s);
}

#[test]
fn tests() {
    util::run_specs("specs/tests.json");
}

#[test]
fn spec_tests() {
    util::run_specs("specs/spec_tests.json");
}

#[test]
fn revert_tests() {
    util::run_specs("specs/revert_tests.json");
}

#[test]
fn merge_tests() {
    util::run_specs("specs/merge_tests.json");
}
103 json-patch/src/tests/util.rs Normal file
@@ -0,0 +1,103 @@
use imbl_value::Value;
use serde::Deserialize;
use std::fmt::Write;
use std::{fs, io};

#[derive(Debug, Deserialize)]
struct TestCase {
    comment: Option<String>,
    doc: Value,
    patch: Value,
    expected: Option<Value>,
    error: Option<String>,
    #[serde(default)]
    disabled: bool,
    #[serde(default)]
    merge: bool,
}

fn run_case(doc: &Value, patches: &Value, merge_patch: bool) -> Result<Value, String> {
    let mut actual = doc.clone();
    if merge_patch {
        crate::merge(&mut actual, &patches);
    } else {
        let patches: crate::Patch =
            imbl_value::from_value(patches.clone()).map_err(|e| e.to_string())?;

        // Patch and verify that in case of error the document wasn't changed
        crate::patch(&mut actual, &patches)
            .map_err(|e| {
                assert_eq!(
                    *doc, actual,
                    "no changes should be made to the original document"
                );
                e
            })
            .map_err(|e| e.to_string())?;
    }
    Ok(actual)
}

fn run_case_patch_unsafe(doc: &Value, patches: &Value) -> Result<Value, String> {
    let mut actual = doc.clone();
    let patches: crate::Patch =
        imbl_value::from_value(patches.clone()).map_err(|e| e.to_string())?;
    crate::patch_unsafe(&mut actual, &patches).map_err(|e| e.to_string())?;
    Ok(actual)
}

pub fn run_specs(path: &str) {
    let file = fs::File::open(path).unwrap();
    let buf = io::BufReader::new(file);
    let cases: Vec<TestCase> = serde_json::from_reader(buf).unwrap();

    for (idx, tc) in cases.into_iter().enumerate() {
        print!("Running test case {}", idx);
        if let Some(comment) = tc.comment {
            print!(" ({})... ", comment);
        } else {
            print!("... ");
        }

        if tc.disabled {
            println!("disabled...");
            continue;
        }

        match run_case(&tc.doc, &tc.patch, tc.merge) {
            Ok(actual) => {
                if let Some(ref error) = tc.error {
                    println!("expected to fail with '{}'", error);
                    panic!("expected to fail, got document {:?}", actual);
                }
                println!();
                if let Some(ref expected) = tc.expected {
                    assert_eq!(*expected, actual);
                }
            }
            Err(err) => {
                println!("failed with '{}'", err);
                tc.error.as_ref().expect("patch expected to succeed");
            }
        }

        if !tc.merge {
            match run_case_patch_unsafe(&tc.doc, &tc.patch) {
                Ok(actual) => {
                    if let Some(ref error) = tc.error {
                        println!("expected to fail with '{}'", error);
                        panic!("expected to fail, got document {:?}", actual);
                    }
                    println!();
                    if let Some(ref expected) = tc.expected {
                        assert_eq!(*expected, actual);
                    }
                }
                Err(err) => {
                    println!("failed with '{}'", err);
                    tc.error.as_ref().expect("patch expected to succeed");
                }
            }
        }
    }
}
1 json-ptr
Submodule json-ptr deleted from db58032732
2 json-ptr/.gitignore vendored Normal file
@@ -0,0 +1,2 @@
/target
Cargo.lock
18 json-ptr/Cargo.toml Normal file
@@ -0,0 +1,18 @@
[package]
name = "json-ptr"
version = "0.1.0"
authors = ["Aiden McClelland <me@drbonez.dev>"]
edition = "2021"
keywords = ["json", "json-pointer"]
description = "RFC 6901, JavaScript Object Notation (JSON) Pointer"
repository = "https://github.com/dr-bonez/json-ptr"
license = "MIT"
readme = "README.md"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
imbl = "6"
imbl-value = "0.4.1"
serde = "1"
thiserror = "2"
969 json-ptr/src/lib.rs Normal file
@@ -0,0 +1,969 @@
use std::borrow::Cow;
use std::cmp::Ordering;
use std::collections::VecDeque;
use std::hash::{Hash, Hasher};
use std::ops::{Add, AddAssign, Bound, Range, RangeBounds};
use std::str::FromStr;

use imbl::Vector;
use imbl_value::{InOMap, InternedString, Value};
use thiserror::Error;

pub type JsonPointerRef<'a> = JsonPointer<&'a str, BorrowedSegList<'a>>;

pub const ROOT: JsonPointerRef = JsonPointer {
    src: "",
    offset: 0,
    segments: (&[], &[]),
};

#[derive(Clone, Debug, Error)]
pub enum ParseError {
    #[error("Invalid Escape: ~{0}")]
    InvalidEscape(char),
    #[error("Missing Leading \"/\"")]
    NoLeadingSlash,
}
#[derive(Clone, Debug, Error)]
pub enum IndexError {
    #[error("Could Not Index Into {0}")]
    CouldNotIndexInto(&'static str),
    #[error("Index Out Of Bounds: {0}")]
    IndexOutOfBounds(usize),
    #[error("Invalid Array Index: {0}")]
    InvalidArrayIndex(#[from] std::num::ParseIntError),
    #[error("Array Index Leading Zero")]
    ArrayIndexLeadingZero,
}

fn parse_idx(idx: &str) -> Result<usize, IndexError> {
    if idx.len() > 1 && idx.starts_with("0") {
        return Err(IndexError::ArrayIndexLeadingZero);
    }
    Ok(idx.parse()?)
}
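`parse_idx` enforces the RFC 6901 rule that an array index is a decimal number with no leading zeros: `"0"` is valid, `"01"` is not, and non-numeric segments fail to parse. A standalone sketch of the same check (the name `parse_array_index` is illustrative):

```rust
// Standalone sketch of the RFC 6901 array-index rule enforced by `parse_idx`:
// a decimal number with no leading zeros ("0" by itself is allowed).
fn parse_array_index(idx: &str) -> Option<usize> {
    if idx.len() > 1 && idx.starts_with('0') {
        return None; // "01", "007", ... are rejected
    }
    idx.parse().ok() // non-numeric segments like "foo" are rejected here
}

fn main() {
    assert_eq!(parse_array_index("0"), Some(0));
    assert_eq!(parse_array_index("42"), Some(42));
    assert_eq!(parse_array_index("01"), None);
    assert_eq!(parse_array_index("foo"), None);
    println!("ok");
}
```

Rejecting leading zeros keeps the string-to-index mapping one-to-one, so two distinct pointer strings can never silently address the same array element.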
|
||||
|
||||
pub type BorrowedSegList<'a> = (&'a [PtrSegment], &'a [PtrSegment]);
|
||||
|
||||
pub trait SegList: Sized {
|
||||
fn as_slices(&self) -> BorrowedSegList<'_>;
|
||||
fn get(&self, mut idx: usize) -> Option<&PtrSegment> {
|
||||
let slices = self.as_slices();
|
||||
for slice in [slices.0, slices.1] {
|
||||
if let Some(seg) = slice.get(idx) {
|
||||
return Some(seg);
|
||||
} else {
|
||||
idx -= slice.len();
|
||||
}
|
||||
}
|
||||
None
|
||||
}
|
||||
fn first(&self) -> Option<&PtrSegment> {
|
||||
let slices = self.as_slices();
|
||||
for slice in [slices.0, slices.1] {
|
||||
if let Some(seg) = slice.first() {
|
||||
return Some(seg);
|
||||
}
|
||||
}
|
||||
None
|
||||
}
|
||||
fn last(&self) -> Option<&PtrSegment> {
|
||||
let slices = self.as_slices();
|
||||
for slice in [slices.0, slices.1].into_iter().rev() {
|
||||
if let Some(seg) = slice.last() {
|
||||
return Some(seg);
|
||||
}
|
||||
}
|
||||
None
|
||||
}
|
||||
fn len(&self) -> usize {
|
||||
let slices = self.as_slices();
|
||||
[slices.0, slices.1]
|
||||
.into_iter()
|
||||
.fold(0, |acc, x| acc + x.len())
|
||||
}
|
||||
fn slice<R: RangeBounds<usize>>(&self, range: R) -> Option<BorrowedSegList<'_>> {
|
||||
let start_idx = match range.start_bound() {
|
||||
Bound::Unbounded => 0,
|
||||
Bound::Included(n) => *n,
|
||||
Bound::Excluded(n) => n + 1,
|
||||
};
|
||||
let end_idx = match range.end_bound() {
|
||||
Bound::Unbounded => self.len(),
|
||||
Bound::Included(n) => n + 1,
|
||||
Bound::Excluded(n) => *n,
|
||||
};
|
||||
let (left, right) = self.as_slices();
|
||||
if start_idx <= left.len() {
|
||||
if end_idx <= left.len() {
|
||||
Some((&left[start_idx..end_idx], &[]))
|
||||
} else if end_idx - left.len() <= right.len() {
|
||||
Some((&left[start_idx..], &right[..end_idx - left.len()]))
|
||||
} else {
|
||||
None
|
||||
}
|
||||
} else if start_idx - left.len() < right.len() && end_idx - left.len() <= right.len() {
|
||||
Some((&[], &right[start_idx - left.len()..end_idx - left.len()]))
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
fn to_vec_deque(self) -> VecDeque<PtrSegment> {
|
||||
let slices = self.as_slices();
|
||||
let mut res = VecDeque::with_capacity(self.len());
|
||||
res.extend([slices.0, slices.1].into_iter().flatten().cloned());
|
||||
res
|
||||
}
|
||||
}
|
||||
|
||||
impl SegList for VecDeque<PtrSegment> {
|
||||
fn as_slices(&self) -> BorrowedSegList<'_> {
|
||||
self.as_slices()
|
||||
}
|
||||
fn to_vec_deque(self) -> VecDeque<PtrSegment> {
|
||||
self
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a> SegList for BorrowedSegList<'a> {
|
||||
fn as_slices(&self) -> BorrowedSegList<'_> {
|
||||
(self.0, self.1)
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, Debug, Default)]
|
||||
pub struct JsonPointer<S: AsRef<str> = String, V: SegList = VecDeque<PtrSegment>> {
|
||||
src: S,
|
||||
offset: usize,
|
||||
segments: V,
|
||||
}
|
||||
impl<'a, S: AsRef<str>, V: SegList> From<&'a JsonPointer<S, V>> for JsonPointerRef<'a> {
|
||||
fn from(value: &'a JsonPointer<S, V>) -> Self {
|
||||
value.borrowed()
|
||||
}
|
||||
}
|
||||
impl<S: AsRef<str>, V: SegList> Eq for JsonPointer<S, V> {}
|
||||
impl<S: AsRef<str>, V: SegList> PartialOrd for JsonPointer<S, V> {
|
||||
fn partial_cmp(&self, other: &JsonPointer<S, V>) -> Option<Ordering> {
|
||||
Some(self.cmp(other))
|
||||
}
|
||||
}
|
||||
impl<S: AsRef<str>, V: SegList> Ord for JsonPointer<S, V> {
|
||||
fn cmp(&self, other: &JsonPointer<S, V>) -> Ordering {
|
||||
let mut a = self.iter();
|
||||
let mut b = other.iter();
|
||||
loop {
|
||||
let a_head = a.next();
|
||||
let b_head = b.next();
|
||||
match (a_head, b_head) {
|
||||
(None, None) => {
|
||||
return Ordering::Equal;
|
||||
}
|
||||
(None, Some(_)) => {
|
||||
return Ordering::Less;
|
||||
}
|
||||
(Some(_), None) => {
|
||||
return Ordering::Greater;
|
||||
}
|
||||
(Some(p), Some(q)) => match p.cmp(q) {
|
||||
Ordering::Equal => {
|
||||
continue;
|
||||
}
|
||||
ne => {
|
||||
return ne;
|
||||
}
|
||||
},
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
impl<S: AsRef<str>> JsonPointer<S> {
|
||||
pub fn parse(s: S) -> Result<Self, ParseError> {
|
||||
let src = s.as_ref();
|
||||
if src == "" {
|
||||
return Ok(JsonPointer {
|
||||
src: s,
|
||||
offset: 0,
|
||||
segments: VecDeque::new(),
|
||||
});
|
||||
}
|
||||
let mut segments = VecDeque::new();
|
||||
let mut segment = PtrSegment::Unescaped(1..1);
|
||||
let mut escape_next_char = false;
|
||||
for (idx, c) in src.char_indices() {
|
||||
if idx == 0 {
|
||||
if c == '/' {
|
||||
continue;
|
||||
} else {
|
||||
return Err(ParseError::NoLeadingSlash);
|
||||
}
|
||||
}
|
||||
if escape_next_char {
|
||||
match c {
|
||||
'0' => {
|
||||
segment = match segment {
|
||||
PtrSegment::Unescaped(range) => PtrSegment::Escaped(
|
||||
range.start..idx + 1,
|
||||
src[range].to_owned() + "~",
|
||||
),
|
||||
PtrSegment::Escaped(range, s) => {
|
||||
PtrSegment::Escaped(range.start..idx + 1, s + "~")
|
||||
}
|
||||
}
|
||||
}
|
||||
'1' => {
|
||||
segment = match segment {
|
||||
PtrSegment::Unescaped(range) => PtrSegment::Escaped(
|
||||
range.start..idx + 1,
|
||||
src[range].to_owned() + "/",
|
||||
),
|
||||
PtrSegment::Escaped(range, s) => {
|
||||
PtrSegment::Escaped(range.start..idx + 1, s + "/")
|
||||
}
|
||||
}
|
||||
}
|
||||
_ => return Err(ParseError::InvalidEscape(c)),
|
||||
}
|
||||
escape_next_char = false;
|
||||
} else {
|
||||
match c {
|
||||
'/' => {
|
||||
segments.push_back(segment);
|
||||
segment = PtrSegment::Unescaped(idx + 1..idx + 1);
|
||||
}
|
||||
'~' => {
|
||||
escape_next_char = true;
|
||||
}
|
||||
_ => match segment {
|
||||
PtrSegment::Unescaped(ref mut range) => range.end = idx + 1,
|
||||
PtrSegment::Escaped(ref mut range, ref mut s) => {
|
||||
range.end = idx + 1;
|
||||
s.push(c);
|
||||
}
|
||||
},
|
||||
}
|
||||
}
|
||||
}
|
||||
segments.push_back(segment);
|
||||
Ok(JsonPointer {
|
||||
src: s,
|
||||
offset: 0,
|
||||
segments,
|
||||
})
|
||||
}
|
||||
}
|
||||
impl<S: AsRef<str>, V: SegList> JsonPointer<S, V> {
|
||||
pub fn borrowed(&self) -> JsonPointerRef<'_> {
|
||||
JsonPointer {
|
||||
src: self.src.as_ref(),
|
||||
offset: self.offset,
|
||||
segments: self.segments.as_slices(),
|
||||
}
|
||||
}
|
||||
pub fn get_segment<'a>(&'a self, idx: usize) -> Option<&'a str> {
|
||||
match self.segments.get(idx) {
|
||||
Some(PtrSegment::Unescaped(range)) => {
|
||||
Some(&self.src.as_ref()[(range.start - self.offset)..(range.end - self.offset)])
|
||||
}
|
||||
Some(PtrSegment::Escaped(_, s)) => Some(&s),
|
||||
None => None,
|
||||
}
|
||||
}
|
||||
pub fn uncons<'a>(
|
||||
&'a self,
|
||||
) -> Option<(
|
||||
&'a str,
|
||||
JsonPointer<&'a str, (&'a [PtrSegment], &'a [PtrSegment])>,
|
||||
)> {
|
||||
if let (Some(s), Some(rest)) = (self.get_segment(0), self.slice(1..)) {
|
||||
Some((s, rest))
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
pub fn len(&self) -> usize {
|
||||
self.segments.len()
|
||||
}
|
||||
pub fn get<'a>(&self, mut doc: &'a Value) -> Option<&'a Value> {
|
||||
for seg in self.iter() {
|
||||
doc = if doc.is_array() {
|
||||
doc.get(parse_idx(seg).ok()?)?
|
||||
} else {
|
||||
doc.get(seg)?
|
||||
};
|
||||
}
|
||||
Some(doc)
|
||||
}
|
||||
pub fn get_mut<'a>(&self, mut doc: &'a mut Value) -> Option<&'a mut Value> {
|
||||
for seg in self.iter() {
|
||||
doc = if doc.is_array() {
|
||||
doc.get_mut(parse_idx(seg).ok()?)?
|
||||
} else {
|
||||
doc.get_mut(seg)?
|
||||
};
|
||||
}
|
||||
Some(doc)
|
||||
}
|
||||
pub fn take(&self, mut doc: &mut Value) -> Option<Value> {
|
||||
for seg in self.iter() {
|
||||
doc = if doc.is_array() {
|
||||
doc.get_mut(parse_idx(seg).ok()?)?
|
||||
} else {
|
||||
doc.get_mut(seg)?
|
||||
};
|
||||
}
|
||||
Some(doc.take())
|
||||
}
|
||||
pub fn set(
|
||||
&self,
|
||||
mut doc: &mut Value,
|
||||
value: Value,
|
||||
recursive: bool,
|
||||
) -> Result<Option<Value>, IndexError> {
|
||||
for (idx, seg) in self.iter().enumerate() {
|
||||
doc = match doc {
|
||||
Value::Array(ref mut l) => {
|
||||
let num = if seg == "-" { l.len() } else { parse_idx(seg)? };
|
||||
if num == l.len() {
|
||||
if let Some(next) = self.get_segment(idx + 1) {
|
||||
if recursive {
|
||||
if next == "0" {
|
||||
l.push_back(Value::Array(Vector::new()));
|
||||
} else {
|
||||
l.push_back(Value::Object(InOMap::new()))
|
||||
}
|
||||
}
|
||||
} else {
|
||||
l.push_back(value);
|
||||
return Ok(None);
|
||||
}
|
||||
}
|
||||
l.get_mut(num).ok_or(IndexError::IndexOutOfBounds(num))?
|
||||
}
|
||||
Value::Bool(_) => return Err(IndexError::CouldNotIndexInto("boolean")),
|
||||
Value::Null => return Err(IndexError::CouldNotIndexInto("null")),
|
||||
Value::Number(_) => return Err(IndexError::CouldNotIndexInto("number")),
|
||||
Value::Object(ref mut o) => {
|
||||
if o.get(seg).is_none() {
|
||||
if let Some(next) = self.get_segment(idx + 1) {
|
||||
if recursive {
|
||||
if next == "0" {
|
||||
o.insert(
|
||||
InternedString::intern(seg),
|
||||
Value::Array(Vector::new()),
|
||||
);
|
||||
} else {
|
||||
o.insert(
|
||||
InternedString::intern(seg),
|
||||
Value::Object(InOMap::new()),
|
||||
);
|
||||
}
|
||||
}
|
||||
} else {
|
||||
o.insert(InternedString::intern(seg), value);
|
||||
return Ok(None);
|
||||
}
|
||||
}
|
||||
o.get_mut(seg).unwrap()
|
||||
}
|
||||
Value::String(_) => return Err(IndexError::CouldNotIndexInto("string")),
|
||||
}
|
||||
}
|
||||
Ok(Some(std::mem::replace(doc, value)))
|
||||
}
|
||||
pub fn insert(
|
||||
&self,
|
||||
mut doc: &mut Value,
|
||||
value: Value,
|
||||
recursive: bool,
|
||||
) -> Result<Option<Value>, IndexError> {
|
||||
for (idx, seg) in self.iter().enumerate() {
|
||||
doc = match doc {
|
||||
Value::Array(ref mut l) => {
|
||||
let num = if seg == "-" { l.len() } else { parse_idx(seg)? };
|
||||
if let Some(next) = self.get_segment(idx + 1) {
|
||||
if num == l.len() && recursive {
|
||||
if next == "0" {
|
||||
l.insert(num, Value::Array(Vector::new()));
|
||||
} else {
|
||||
l.insert(num, Value::Object(InOMap::new()))
|
||||
}
|
||||
}
|
||||
} else if num <= l.len() {
|
||||
l.insert(num, value);
|
||||
return Ok(None);
|
||||
}
|
||||
l.get_mut(num).ok_or(IndexError::IndexOutOfBounds(num))?
|
||||
}
|
||||
Value::Bool(_) => return Err(IndexError::CouldNotIndexInto("boolean")),
|
||||
Value::Null => return Err(IndexError::CouldNotIndexInto("null")),
|
||||
Value::Number(_) => return Err(IndexError::CouldNotIndexInto("number")),
|
||||
Value::Object(ref mut o) => {
|
||||
if o.get(seg).is_none() {
|
||||
if let Some(next) = self.get_segment(idx + 1) {
|
||||
if recursive {
|
||||
if next == "0" {
|
||||
o.insert(
|
||||
InternedString::intern(seg),
|
||||
Value::Array(Vector::new()),
|
||||
);
|
||||
} else {
|
||||
o.insert(
|
||||
InternedString::intern(seg),
|
||||
Value::Object(InOMap::new()),
|
||||
);
|
||||
}
|
||||
}
|
||||
} else {
|
||||
o.insert(InternedString::intern(seg), value);
|
||||
return Ok(None);
|
||||
}
|
||||
}
|
||||
o.get_mut(seg)
|
||||
.ok_or(IndexError::CouldNotIndexInto("undefined"))?
|
||||
}
|
||||
Value::String(_) => return Err(IndexError::CouldNotIndexInto("string")),
|
||||
}
|
||||
}
|
||||
Ok(Some(std::mem::replace(doc, value)))
|
||||
}
|
||||
pub fn remove(&self, mut doc: &mut Value, allow_last: bool) -> Option<Value> {
|
||||
for (idx, seg) in self.iter().enumerate() {
|
||||
if self.get_segment(idx + 1).is_none() {
|
||||
match doc {
|
||||
Value::Array(ref mut l) => {
|
||||
let num = if allow_last && seg == "-" && !l.is_empty() {
|
||||
l.len() - 1
|
||||
} else {
|
||||
parse_idx(seg).ok()?
|
||||
};
|
||||
if num < l.len() {
|
||||
return Some(l.remove(num));
|
||||
} else {
|
||||
return None;
|
||||
}
|
||||
}
|
||||
Value::Object(ref mut o) => {
|
||||
return o.remove(seg);
|
||||
}
|
||||
_ => return None,
|
||||
}
|
||||
} else {
|
||||
doc = match doc {
|
||||
Value::Array(ref mut arr) => {
|
||||
if allow_last && seg == "-" && !arr.is_empty() {
|
||||
let arr_len = arr.len();
|
||||
arr.get_mut(arr_len - 1)?
|
||||
} else {
|
||||
arr.get_mut(parse_idx(seg).ok()?)?
|
||||
}
|
||||
}
|
||||
Value::Object(ref mut o) => o.get_mut(seg)?,
|
||||
_ => return None,
|
||||
};
|
||||
}
|
||||
}
|
||||
None
|
||||
}
|
||||
pub fn is_empty(&self) -> bool {
|
||||
self.segments.len() == 0
|
||||
}
|
||||
    pub fn to_owned(self) -> JsonPointer {
        JsonPointer {
            src: self.src.as_ref().to_owned(),
            offset: self.offset,
            segments: self.segments.to_vec_deque(),
        }
    }
    pub fn common_prefix<'a, S0: AsRef<str>, V0: SegList>(
        &'a self,
        other: &JsonPointer<S0, V0>,
    ) -> JsonPointer<&'a str, (&'a [PtrSegment], &'a [PtrSegment])> {
        let mut common = None;
        for (idx, seg) in self.iter().enumerate() {
            if Some(seg) != other.get_segment(idx) {
                break;
            }
            common = Some(idx);
        }
        let common_idx = if let Some(common) = common {
            self.segments
                .get(common)
                .map(PtrSegment::range)
                .map(|r| r.end - self.offset)
                .unwrap_or(0)
        } else {
            0
        };
        let src = self.src.as_ref();
        JsonPointer {
            src: &src[0..common_idx],
            offset: self.offset,
            segments: self
                .segments
                .slice(0..(common.map(|a| a + 1).unwrap_or(0)))
                .unwrap(),
        }
    }
    pub fn starts_with<S0: AsRef<str>, V0: SegList>(&self, other: &JsonPointer<S0, V0>) -> bool {
        for (idx, seg) in other.iter().enumerate() {
            if self.get_segment(idx) != Some(seg) {
                return false;
            }
        }
        true
    }
    pub fn strip_prefix<'a, S0: AsRef<str>, V0: SegList>(
        &'a self,
        other: &JsonPointer<S0, V0>,
    ) -> Option<JsonPointer<&'a str, (&'a [PtrSegment], &'a [PtrSegment])>> {
        for (idx, seg) in other.iter().enumerate() {
            if self.get_segment(idx) != Some(seg) {
                return None;
            }
        }
        if self.len() == other.len() {
            return Some(Default::default());
        }
        let src_start = self.segments.get(other.segments.len())?.range().start - 1;
        Some(JsonPointer {
            src: &self.src.as_ref()[src_start..],
            offset: src_start,
            segments: self.segments.slice(other.segments.len()..)?,
        })
    }
    pub fn slice<R: RangeBounds<usize>>(
        &self,
        range: R,
    ) -> Option<JsonPointer<&str, (&[PtrSegment], &[PtrSegment])>> {
        let mut s = self.src.as_ref();
        let seg = self.segments.slice(range)?;
        let mut iter = seg.0.iter().chain(seg.1.iter());
        let offset;
        if let Some(first) = iter.next() {
            let last = iter.next_back().unwrap_or(first);
            offset = first.range().start - 1;
            s = &s[first.range().start - 1 - self.offset..last.range().end - self.offset];
        } else {
            offset = 0;
            s = "";
        }
        Some(JsonPointer {
            src: s,
            offset,
            segments: seg,
        })
    }
    pub fn iter<'a>(&'a self) -> JsonPointerIter<'a, S, V> {
        JsonPointerIter {
            ptr: self,
            start: 0,
            end: self.segments.len(),
        }
    }
}
impl<S: AsRef<str>, V: IntoIterator<Item = PtrSegment> + SegList> JsonPointer<S, V> {
    pub fn into_iter(self) -> JsonPointerIntoIter<S, V> {
        JsonPointerIntoIter {
            src: self.src,
            iter: self.segments.into_iter(),
        }
    }
}
impl JsonPointer<String> {
    pub fn push_end(&mut self, segment: &str) {
        let mut escaped = false;
        self.src.push('/');
        let start = self.src.len();
        for c in segment.chars() {
            match c {
                '~' => {
                    self.src += "~0";
                    escaped = true;
                }
                '/' => {
                    self.src += "~1";
                    escaped = true;
                }
                _ => {
                    self.src.push(c);
                }
            }
        }
        self.segments.push_back(if escaped {
            PtrSegment::Escaped(start..self.src.len(), segment.to_string())
        } else {
            PtrSegment::Unescaped(start..self.src.len())
        })
    }
    pub fn push_end_idx(&mut self, segment: usize) {
        use std::fmt::Write;
        let start = self.src.len() + 1;
        write!(self.src, "/{}", segment).unwrap();
        let end = self.src.len();
        self.segments.push_back(PtrSegment::Unescaped(start..end));
    }
    pub fn push_start(&mut self, segment: &str) {
        let escaped = segment.chars().filter(|c| *c == '~' || *c == '/').count();
        let prefix_len = segment.len() + escaped + 1;
        let mut src = String::with_capacity(self.src.len() + prefix_len);
        src.push('/');
        for c in segment.chars() {
            match c {
                '~' => {
                    src += "~0";
                }
                '/' => {
                    src += "~1";
                }
                _ => {
                    src.push(c);
                }
            }
        }
        src += self.src.as_str();
        for seg in self.segments.iter_mut() {
            let range = seg.range_mut();
            range.start += prefix_len;
            range.end += prefix_len;
        }
        self.src = src;
        self.segments.push_front(if escaped > 0 {
            PtrSegment::Escaped(1..prefix_len, segment.to_owned())
        } else {
            PtrSegment::Unescaped(1..prefix_len)
        });
    }
    // @claude fix #3: Previously just inserted the new segment without adjusting
    // existing segment ranges (unlike push_start, which did shift them). All
    // existing segments' byte ranges became wrong, causing corrupted pointer
    // lookups or panics. Now shifts ranges forward by prefix_len, matching
    // push_start behavior.
    pub fn push_start_idx(&mut self, segment: usize) {
        let mut src = format!("/{}", segment);
        let prefix_len = src.len();
        src += self.src.as_str();
        for seg in self.segments.iter_mut() {
            let range = seg.range_mut();
            range.start += prefix_len;
            range.end += prefix_len;
        }
        self.src = src;
        self.segments.insert(0, PtrSegment::Unescaped(1..prefix_len));
    }
    pub fn pop_end(&mut self) {
        if let Some(last) = self.segments.pop_back() {
            self.src.truncate(last.range().start - 1)
        }
    }
    pub fn pop_start(&mut self) {
        if let Some(last) = self.segments.pop_front() {
            let range = last.into_range();
            self.src.replace_range(range.start - 1..range.end, "");
        }
    }
    pub fn truncate(&mut self, new_len: usize) {
        if let Some(seg) = self.segments.get(new_len) {
            self.src.truncate(seg.range().start - 1);
            self.segments.truncate(new_len);
        }
    }
    pub fn join_end(mut self, segment: &str) -> Self {
        self.push_end(segment);
        self
    }
    pub fn join_end_idx(mut self, segment: usize) -> Self {
        self.push_end_idx(segment);
        self
    }
    pub fn join_start(mut self, segment: &str) -> Self {
        self.push_start(segment);
        self
    }
    pub fn join_start_idx(mut self, segment: usize) -> Self {
        self.push_start_idx(segment);
        self
    }
    pub fn append<S: AsRef<str>, V: SegList>(&mut self, suffix: &JsonPointer<S, V>) {
        for seg in suffix.iter() {
            self.push_end(seg)
        }
    }
    pub fn prepend<S: AsRef<str>, V: SegList>(&mut self, prefix: &JsonPointer<S, V>) {
        for seg in prefix.iter().rev() {
            self.push_start(seg);
        }
    }
}
impl FromStr for JsonPointer<String> {
    type Err = ParseError;
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        JsonPointer::parse(s.to_owned())
    }
}
impl<S: AsRef<str>, V: SegList> AsRef<str> for JsonPointer<S, V> {
    fn as_ref(&self) -> &str {
        self.src.as_ref()
    }
}
impl<S0, S1, V0, V1> PartialEq<JsonPointer<S1, V1>> for JsonPointer<S0, V0>
where
    S0: AsRef<str>,
    S1: AsRef<str>,
    V0: SegList,
    V1: SegList,
{
    fn eq(&self, rhs: &JsonPointer<S1, V1>) -> bool {
        self.segments.len() == rhs.segments.len() && {
            let mut rhs_iter = rhs.iter();
            self.iter().all(|lhs| Some(lhs) == rhs_iter.next())
        }
    }
}
impl<S: AsRef<str>, V: SegList> Hash for JsonPointer<S, V> {
    fn hash<H: Hasher>(&self, state: &mut H) {
        for seg in self.iter() {
            seg.hash(state);
        }
    }
}
impl<S: AsRef<str>, V: SegList> std::fmt::Display for JsonPointer<S, V> {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        std::fmt::Display::fmt(self.src.as_ref(), f)
    }
}
impl<'a, S0, S1, V0, V1> Add<&'a JsonPointer<S1, V1>> for JsonPointer<S0, V0>
where
    S0: AsRef<str> + Add<&'a str>,
    S0::Output: AsRef<str>,
    S1: AsRef<str>,
    V0: SegList + Extend<PtrSegment>,
    V1: SegList,
    for<'v> &'v V1: IntoIterator<Item = &'v PtrSegment>,
{
    type Output = JsonPointer<S0::Output, V0>;
    fn add(mut self, rhs: &'a JsonPointer<S1, V1>) -> Self::Output {
        let src_len = self.src.as_ref().len();
        let offset = self.offset;
        self.segments
            .extend((&rhs.segments).into_iter().map(|seg| match seg {
                PtrSegment::Unescaped(range) => PtrSegment::Unescaped(
                    range.start - rhs.offset + src_len + offset
                        ..range.end - rhs.offset + src_len + offset,
                ),
                PtrSegment::Escaped(range, s) => PtrSegment::Escaped(
                    range.start - rhs.offset + src_len + offset
                        ..range.end - rhs.offset + src_len + offset,
                    s.clone(),
                ),
            }));
        JsonPointer {
            src: self.src + rhs.src.as_ref(),
            offset,
            segments: self.segments,
        }
    }
}
impl<'a, S0, S1, V0, V1> Add<&'a JsonPointer<S1, V1>> for &JsonPointer<S0, V0>
where
    S0: AsRef<str> + Add<&'a str> + Clone,
    S0::Output: AsRef<str>,
    S1: AsRef<str>,
    V0: SegList + Clone + Extend<PtrSegment>,
    V1: SegList,
    for<'v> &'v V1: IntoIterator<Item = &'v PtrSegment>,
{
    type Output = JsonPointer<S0::Output, V0>;
    fn add(self, rhs: &'a JsonPointer<S1, V1>) -> Self::Output {
        let src_len = self.src.as_ref().len();
        let mut segments = self.segments.clone();
        segments.extend((&rhs.segments).into_iter().map(|seg| match seg {
            PtrSegment::Unescaped(range) => PtrSegment::Unescaped(
                range.start - rhs.offset + src_len + self.offset
                    ..range.end - rhs.offset + src_len + self.offset,
            ),
            PtrSegment::Escaped(range, s) => PtrSegment::Escaped(
                range.start - rhs.offset + src_len + self.offset
                    ..range.end - rhs.offset + src_len + self.offset,
                s.clone(),
            ),
        }));
        JsonPointer {
            src: self.src.clone() + rhs.src.as_ref(),
            offset: self.offset,
            segments,
        }
    }
}
impl<'a, S0, S1, V0, V1> AddAssign<&'a JsonPointer<S1, V1>> for JsonPointer<S0, V0>
where
    S0: AsRef<str> + AddAssign<&'a str>,
    S1: AsRef<str>,
    V0: SegList + Extend<PtrSegment>,
    V1: SegList,
    for<'v> &'v V1: IntoIterator<Item = &'v PtrSegment>,
{
    fn add_assign(&mut self, rhs: &'a JsonPointer<S1, V1>) {
        let src_len = self.src.as_ref().len();
        let offset = self.offset;
        self.segments
            .extend((&rhs.segments).into_iter().map(|seg| match seg {
                PtrSegment::Unescaped(range) => PtrSegment::Unescaped(
                    range.start - rhs.offset + src_len + offset
                        ..range.end - rhs.offset + src_len + offset,
                ),
                PtrSegment::Escaped(range, s) => PtrSegment::Escaped(
                    range.start - rhs.offset + src_len + offset
                        ..range.end - rhs.offset + src_len + offset,
                    s.clone(),
                ),
            }));
        self.src += rhs.src.as_ref();
    }
}
impl<'de, S> serde::de::Deserialize<'de> for JsonPointer<S>
where
    S: AsRef<str> + serde::de::Deserialize<'de>,
{
    fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
    where
        D: serde::de::Deserializer<'de>,
    {
        Ok(JsonPointer::parse(S::deserialize(deserializer)?).map_err(serde::de::Error::custom)?)
    }
}
impl<S> serde::ser::Serialize for JsonPointer<S>
where
    S: AsRef<str> + serde::ser::Serialize,
{
    fn serialize<Ser>(&self, serializer: Ser) -> Result<Ser::Ok, Ser::Error>
    where
        Ser: serde::ser::Serializer,
    {
        self.src.serialize(serializer)
    }
}

#[derive(Clone, Debug)]
pub enum PtrSegment {
    Unescaped(Range<usize>),
    Escaped(Range<usize>, String),
}
impl PtrSegment {
    fn range(&self) -> &Range<usize> {
        match self {
            PtrSegment::Unescaped(range) => range,
            PtrSegment::Escaped(range, _) => range,
        }
    }
    fn range_mut(&mut self) -> &mut Range<usize> {
        match self {
            PtrSegment::Unescaped(range) => range,
            PtrSegment::Escaped(range, _) => range,
        }
    }
    fn into_range(self) -> Range<usize> {
        match self {
            PtrSegment::Unescaped(range) => range,
            PtrSegment::Escaped(range, _) => range,
        }
    }
}

pub struct JsonPointerIter<'a, S: AsRef<str> + 'a, V: SegList> {
    ptr: &'a JsonPointer<S, V>,
    start: usize,
    end: usize,
}
impl<'a, S: AsRef<str>, V: SegList> Iterator for JsonPointerIter<'a, S, V> {
    type Item = &'a str;
    fn next(&mut self) -> Option<Self::Item> {
        if self.start < self.end {
            let ret = self.ptr.get_segment(self.start);
            self.start += 1;
            ret
        } else {
            None
        }
    }
    fn size_hint(&self) -> (usize, Option<usize>) {
        let size = self.end - self.start;
        (size, Some(size))
    }
}
impl<'a, S: AsRef<str>, V: SegList> DoubleEndedIterator for JsonPointerIter<'a, S, V> {
    fn next_back(&mut self) -> Option<Self::Item> {
        if self.start < self.end {
            self.end -= 1;
            self.ptr.get_segment(self.end)
        } else {
            None
        }
    }
}

pub struct JsonPointerIntoIter<S: AsRef<str>, V: IntoIterator<Item = PtrSegment> + SegList> {
    src: S,
    iter: V::IntoIter,
}
impl<S: AsRef<str>, V: IntoIterator<Item = PtrSegment> + SegList> JsonPointerIntoIter<S, V> {
    fn next<'a>(&'a mut self) -> Option<Cow<'a, str>> {
        if let Some(seg) = self.iter.next() {
            Some(match seg {
                PtrSegment::Unescaped(range) => Cow::Borrowed(&self.src.as_ref()[range]),
                PtrSegment::Escaped(_, s) => Cow::Owned(s),
            })
        } else {
            None
        }
    }
}
impl<S: AsRef<str>, V: IntoIterator<Item = PtrSegment> + SegList> JsonPointerIntoIter<S, V>
where
    V::IntoIter: DoubleEndedIterator,
{
    fn next_back<'a>(&'a mut self) -> Option<Cow<'a, str>> {
        if let Some(seg) = self.iter.next_back() {
            Some(match seg {
                PtrSegment::Unescaped(range) => Cow::Borrowed(&self.src.as_ref()[range]),
                PtrSegment::Escaped(_, s) => Cow::Owned(s),
            })
        } else {
            None
        }
    }
}
impl<S: AsRef<str>, V: IntoIterator<Item = PtrSegment> + SegList> Iterator
    for JsonPointerIntoIter<S, V>
{
    type Item = String;
    fn next(&mut self) -> Option<Self::Item> {
        self.next().map(|s| s.to_string())
    }
    fn size_hint(&self) -> (usize, Option<usize>) {
        self.iter.size_hint()
    }
}
impl<S: AsRef<str>, V: IntoIterator<Item = PtrSegment> + SegList> DoubleEndedIterator
    for JsonPointerIntoIter<S, V>
where
    V::IntoIter: DoubleEndedIterator,
{
    fn next_back(&mut self) -> Option<Self::Item> {
        self.next_back().map(|s| s.to_string())
    }
}

#[test]
fn uncons_base() {
    let base: JsonPointer = "".parse().unwrap();
    assert_eq!(base.uncons(), None)
}

#[test]
fn uncons_inductive() {
    let inductive: JsonPointer = "/test/check".parse().unwrap();
    let (first, rest) = inductive.uncons().unwrap();
    assert_eq!(first, "test");
    assert_eq!(rest, "/check".parse::<JsonPointer>().unwrap());
}
@@ -12,6 +12,6 @@ proc-macro = true
 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

 [dependencies]
-patch-db-macro-internals = { path = "../patch-db-macro-internals" }
+patch-db-macro-internals = { path = "../macro-internals" }
 syn = "1.0.62"
 proc-macro2 = "1.0.1"
@@ -7,6 +7,6 @@ edition = "2021"

 [dependencies]
 clap = "3.2.16"
-patch-db = { path = "../patch-db", features = ["debug"] }
+patch-db = { path = "../core", features = ["debug"] }
 serde_json = "1.0.85"
 tokio = { version = "1.20.1", features = ["full"] }