Merged
51 changes: 51 additions & 0 deletions .github/workflows/tests.yaml
@@ -0,0 +1,51 @@
name: Tests
on: [ push, pull_request, workflow_dispatch ]

permissions: {}

concurrency:
group: tests-${{ github.ref }}
cancel-in-progress: true

jobs:
pytest:
runs-on: ubuntu-latest
permissions:
contents: read
strategy:
fail-fast: false
matrix:
python-version: [ "3.12", "3.13", "3.14" ]
name: pytest (Python ${{ matrix.python-version }})
steps:
- name: Checkout
uses: actions/checkout@v5

- name: Install uv
uses: astral-sh/setup-uv@v6

- name: Install Python ${{ matrix.python-version }}
run: uv python install ${{ matrix.python-version }}

- name: Install dependencies
run: uv sync --all-extras --python ${{ matrix.python-version }}

# Network-dependent tests need a live OSH server (e.g. localhost:8282).
# They're tagged `@pytest.mark.network` and skipped here. The plan is
# to shim those with mocks; once a test no longer needs a real server,
# drop the marker and it will run in CI automatically.
- name: Run pytest with coverage
run: |
uv run --python ${{ matrix.python-version }} pytest -v \
-m "not network" \
--cov --cov-report=term --cov-report=xml

# Keep coverage.xml around so a later badge/Codecov upload step can use it.
- name: Upload coverage report artifact
if: always()
uses: actions/upload-artifact@v4
with:
name: coverage-${{ matrix.python-version }}
path: coverage.xml
if-no-files-found: warn
retention-days: 7
54 changes: 54 additions & 0 deletions README.md
@@ -9,6 +9,60 @@ Links:
* [Architecture Doc](https://docs.google.com/document/d/1pIaeQw0ocU6ApNgqTVRZuSwjJAbhCcmweMq6RiVYEic/edit?usp=sharing)
* [UML Diagram](https://drive.google.com/file/d/1FVrnYiuAR8ykqfOUa1NuoMyZ1abXzMPw/view?usp=drive_link)

## Running Tests

```bash
uv sync # install dev deps (incl. pytest, pytest-cov)
uv run pytest                                 # full suite; add -m "not network" to skip server-dependent tests
uv run pytest tests/test_swe_components.py -v # one file, verbose
uv run pytest -k name_token # one keyword
```

Tests that need a live OSH server (e.g. `localhost:8282` running
FakeWeatherDriver) are tagged `@pytest.mark.network`. CI skips them; locally
you can include or exclude them:

```bash
uv run pytest -m "not network" # what CI runs
uv run pytest -m network # only the live-server tests
```
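Tagging works with a plain pytest marker. The sketch below is illustrative, not a test from the suite: the function name and body are hypothetical, and only the `network` marker name comes from the project's `pyproject.toml` registration.

```python
import pytest

# Hypothetical example of a server-dependent test. The `network` marker is
# the one registered in pyproject.toml; everything else here is illustrative.
@pytest.mark.network
def test_live_server_roundtrip():
    # Would talk to a live OSH server (e.g. localhost:8282) here.
    pass
```

Running `pytest -m "not network"` deselects any test carrying this marker, which is exactly what CI does.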

## Test Coverage

Coverage is opt-in via [`pytest-cov`](https://pytest-cov.readthedocs.io/). The
default `pytest` run is fast; add `--cov` when you want a report.

```bash
uv run pytest --cov # terminal summary + missing lines
uv run pytest --cov --cov-report=html # HTML report at htmlcov/index.html
uv run pytest --cov --cov-report=xml # coverage.xml (CI / Codecov-ready)
```

Configuration lives in `pyproject.toml` under `[tool.coverage.*]` — branch
coverage is on, source is scoped to `src/oshconnect`, and obvious dead lines
(`if TYPE_CHECKING:`, `raise NotImplementedError`, etc.) are excluded.
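A minimal sketch of what those exclusions buy you; the `Driver` class is invented for illustration and is not part of the library.

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # This branch only runs under a type checker, never at runtime, so
    # without the "if TYPE_CHECKING:" exclude pattern coverage would
    # report these lines as missed.
    import json  # noqa: F401

class Driver:
    def read(self) -> str:
        # Matched by the "raise NotImplementedError" exclude pattern, so
        # abstract-style stubs don't drag the percentage down.
        raise NotImplementedError
```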

CI (`.github/workflows/tests.yaml`) runs the suite with `--cov` on every push
across Python 3.12 / 3.13 / 3.14 and uploads `coverage.xml` as a workflow
artifact (downloadable from the run page).

## Documentation Coverage

[`interrogate`](https://interrogate.readthedocs.io/) reports what fraction of
public modules / classes / functions / methods carry a docstring (presence
only; it doesn't check style). It's purely informational right now; there's
no CI gate. Configuration lives in `pyproject.toml` under `[tool.interrogate]`
(`__init__`, dunder, private, and property/setter members are skipped).

```bash
uv run interrogate src/oshconnect # one-line summary
uv run interrogate -v src/oshconnect # per-file table
uv run interrogate -vv src/oshconnect # per-symbol (shows which symbols are missing)
```

Once we agree on a baseline, raise `[tool.interrogate].fail-under` from `0` so
new code without docstrings starts failing locally and in CI.

## Generating the Docs

The documentation is built with [MkDocs](https://www.mkdocs.org/) using the
45 changes: 44 additions & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[project]
name = "oshconnect"
version = "0.5.0a0"
version = "0.5.0a1"
description = "Library for interfacing with OSH, helping guide visualization efforts, and providing a place to store configurations. Implements OGC CS API Part 3 (Pub/Sub) MQTT topic conventions including :data topics and resource event topics."
readme = "README.md"
authors = [
@@ -19,6 +19,8 @@ dependencies = [
dev = [
"flake8>=7.2.0",
"pytest>=8.3.5",
"pytest-cov>=5.0.0",
"interrogate>=1.7.0",
"sphinx>=7.4.7",
"sphinx-rtd-theme>=2.0.0",
"mkdocs-material>=9.5.0",
@@ -31,3 +33,44 @@ packages = {find = { where = ["src/"]}}

[tool.pytest.ini_options]
pythonpath = ["src"]
markers = [
"network: test requires a live OSH server or external network endpoint (skipped by default in CI; see workflow `tests.yaml`).",
]

# Coverage is opt-in (run with `pytest --cov`) so the default `pytest` run stays fast.
# `--cov` with no argument picks up the source paths configured below.

[tool.coverage.run]
source = ["src/oshconnect"]
branch = true

[tool.coverage.report]
show_missing = true
skip_covered = false
precision = 2
exclude_lines = [
"pragma: no cover",
"raise NotImplementedError",
"if TYPE_CHECKING:",
"@(abc\\.)?abstractmethod",
"if __name__ == .__main__.:",
]

[tool.coverage.html]
directory = "htmlcov"

[tool.coverage.xml]
output = "coverage.xml"

# Docstring presence (not style). Run with `uv run interrogate -v src/oshconnect`.
[tool.interrogate]
ignore-init-method = true # constructors covered by class docstring
ignore-init-module = true # don't require docstrings on bare __init__.py
ignore-magic = true # skip dunder methods (__repr__, __eq__, etc.)
ignore-private = true # skip _name and __name (non-dunder) members
ignore-property-decorators = true
ignore-nested-functions = true
ignore-setters = true
fail-under = 0 # report-only for now; raise once a baseline is set
exclude = ["tests", "docs", "build", ".venv", "scripts"]
verbose = 2 # 0=summary, 1=per-file, 2=per-symbol
34 changes: 17 additions & 17 deletions src/oshconnect/datastores/sqlite_store.py
@@ -30,14 +30,14 @@ class SQLiteDataStore(DataStore):
Schema notes
------------
Each resource type is stored as a single JSON blob (the output of its
``serialize()`` method) alongside a primary-key string ID and any foreign-key
columns needed for filtered lookups. Using blobs means new Pydantic fields
do not require schema migrations.
``to_storage_dict()`` method) alongside a primary-key string ID and any
foreign-key columns needed for filtered lookups. Using blobs means new
Pydantic fields do not require schema migrations.

*Bulk operations* (``save_all`` / ``load_all``) work at the Node level:
``save_all`` persists every resource separately for individual lookups;
``load_all`` reconstructs the full hierarchy from the *nodes* table only
(``Node.deserialize`` handles the embedded systems/streams), avoiding
(``Node.from_storage_dict`` handles the embedded systems/streams), avoiding
duplication.
"""

@@ -87,7 +87,7 @@ def _execute(self, sql: str, params: tuple = ()) -> sqlite3.Cursor:
# ------------------------------------------------------------------

def save_node(self, node: Node) -> None:
data = json.dumps(node.serialize())
data = json.dumps(node.to_storage_dict())
self._execute(
"INSERT OR REPLACE INTO nodes (id, data) VALUES (?, ?)",
(node.get_id(), data),
@@ -102,14 +102,14 @@ def load_node(
).fetchone()
if row is None:
return None
return Node.deserialize(json.loads(row["data"]), session_manager=session_manager)
return Node.from_storage_dict(json.loads(row["data"]), session_manager=session_manager)

def load_all_nodes(
self, session_manager: Optional[SessionManager] = None
) -> list[Node]:
rows = self._execute("SELECT data FROM nodes").fetchall()
return [
Node.deserialize(json.loads(r["data"]), session_manager=session_manager)
Node.from_storage_dict(json.loads(r["data"]), session_manager=session_manager)
for r in rows
]

@@ -123,7 +123,7 @@ def delete_node(self, node_id: str) -> None:

def save_system(self, system: System, node: Node) -> None:
system_id = str(system.get_internal_id())
data = json.dumps(system.serialize())
data = json.dumps(system.to_storage_dict())
self._execute(
"INSERT OR REPLACE INTO systems (id, node_id, data) VALUES (?, ?, ?)",
(system_id, node.get_id(), data),
@@ -136,13 +136,13 @@ def load_system(self, system_id: str, node: Node) -> Optional[System]:
).fetchone()
if row is None:
return None
return System.deserialize(json.loads(row["data"]), node)
return System.from_storage_dict(json.loads(row["data"]), node)

def load_systems_for_node(self, node_id: str, node: Node) -> list[System]:
rows = self._execute(
"SELECT data FROM systems WHERE node_id = ?", (node_id,)
).fetchall()
return [System.deserialize(json.loads(r["data"]), node) for r in rows]
return [System.from_storage_dict(json.loads(r["data"]), node) for r in rows]

def delete_system(self, system_id: str) -> None:
self._execute("DELETE FROM systems WHERE id = ?", (system_id,))
@@ -155,7 +155,7 @@ def delete_system(self, system_id: str) -> None:
def save_datastream(self, datastream: Datastream, node: Node) -> None:
ds_id = str(datastream.get_internal_id())
system_id = datastream.get_parent_resource_id()
data = json.dumps(datastream.serialize())
data = json.dumps(datastream.to_storage_dict())
self._execute(
"INSERT OR REPLACE INTO datastreams (id, system_id, node_id, data) VALUES (?, ?, ?, ?)",
(ds_id, system_id, node.get_id(), data),
@@ -168,13 +168,13 @@ def load_datastream(self, datastream_id: str, node: Node) -> Optional[Datastream
).fetchone()
if row is None:
return None
return Datastream.deserialize(json.loads(row["data"]), node)
return Datastream.from_storage_dict(json.loads(row["data"]), node)

def load_datastreams_for_system(self, system_id: str, node: Node) -> list[Datastream]:
rows = self._execute(
"SELECT data FROM datastreams WHERE system_id = ?", (system_id,)
).fetchall()
return [Datastream.deserialize(json.loads(r["data"]), node) for r in rows]
return [Datastream.from_storage_dict(json.loads(r["data"]), node) for r in rows]

def delete_datastream(self, datastream_id: str) -> None:
self._execute("DELETE FROM datastreams WHERE id = ?", (datastream_id,))
@@ -187,7 +187,7 @@ def delete_datastream(self, datastream_id: str) -> None:
def save_controlstream(self, controlstream: ControlStream, node: Node) -> None:
cs_id = str(controlstream.get_internal_id())
system_id = controlstream.get_parent_resource_id()
data = json.dumps(controlstream.serialize())
data = json.dumps(controlstream.to_storage_dict())
self._execute(
"INSERT OR REPLACE INTO controlstreams (id, system_id, node_id, data) VALUES (?, ?, ?, ?)",
(cs_id, system_id, node.get_id(), data),
@@ -200,13 +200,13 @@ def load_controlstream(self, controlstream_id: str, node: Node) -> Optional[Cont
).fetchone()
if row is None:
return None
return ControlStream.deserialize(json.loads(row["data"]), node)
return ControlStream.from_storage_dict(json.loads(row["data"]), node)

def load_controlstreams_for_system(self, system_id: str, node: Node) -> list[ControlStream]:
rows = self._execute(
"SELECT data FROM controlstreams WHERE system_id = ?", (system_id,)
).fetchall()
return [ControlStream.deserialize(json.loads(r["data"]), node) for r in rows]
return [ControlStream.from_storage_dict(json.loads(r["data"]), node) for r in rows]

def delete_controlstream(self, controlstream_id: str) -> None:
self._execute("DELETE FROM controlstreams WHERE id = ?", (controlstream_id,))
@@ -232,7 +232,7 @@ def load_all(
) -> list[Node]:
"""Reconstruct the full resource graph from the nodes table.

``Node.deserialize`` handles the embedded systems/datastreams/
``Node.from_storage_dict`` handles the embedded systems/datastreams/
controlstreams hierarchy, so only the *nodes* table is used here.
The individual resource tables (systems, datastreams, controlstreams)
exist for targeted single-resource lookups and are not consulted here
2 changes: 1 addition & 1 deletion src/oshconnect/oshconnectapi.py
@@ -99,7 +99,7 @@ def save_config(self):

data = {}
for node in self._nodes:
node_dict = node.serialize()
node_dict = node.to_storage_dict()
data.update({node.get_id(): node_dict})

# write to JSON file