Compare commits


3 Commits

Author SHA1 Message Date
ac20f67f12 Decode wmedf signal data, add --signals to session and export commands
Implement full EDF signal decoder for 18 Löwenstein channels: parse
interleaved signal headers, decode i16 LE data records with per-channel
sample rates, and clamp 8-bit-range channels to declared digital range.

Move sample_rate_hz from SignalBlock to SignalChannel to support
per-channel rates (Pressure 0.278Hz, RespFlow 0.556Hz, others 0.056Hz).
Add SignalLabel variants: EEPAPTarget, AMV, SPRStatus, TotalLeakage, RSBI.

CLI: `tidal session <id> --signals` shows channel summary table.
Export: `tidal export --signals` includes per-channel summaries in JSON.

Fix CLAUDE.md mapping table to match code: clarify AASM reclassification
footnote for IDs 106/111/151, add missing ID 1118, document 1007 as
separate detection pathway from ID 2.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-28 19:44:57 +02:00
dba07cc957 doc: decode signals planning 2026-03-28 19:43:37 +02:00
9517f0f964 Fix hypopnea overcounting, filter device lifecycle from reports
Reclassify sub-threshold hypopneas (Duration < 10s) as FlowLimitation
at parse time per AASM 2012 criteria. Corrects AHI from ~606 to ~216
for the test dataset.

Add EventType::is_clinical() to distinguish respiratory/therapy events
from device lifecycle signals. Reports and summaries filter to clinical
events by default; full record accessible via --events.

Clean up all compiler warnings in tidal-devices.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-28 18:41:02 +02:00
8 changed files with 387 additions and 120 deletions


@@ -69,7 +69,7 @@ mnt/flash/
**Confirmed RespEventID mapping:**
All events are surfaced to the user. The device marks some events `Visible="0"` but Tidal does not respect manufacturer visibility directives — data about a user belongs to the user.
All events are stored; the user owns all their data. Reports and summaries filter to clinical events by default (`EventType::is_clinical()`), excluding device lifecycle signals (SessionStart, SessionEnd, Unknown). The full event record is always accessible via `tidal session <id> --events`.
| ID | EventType | Notes |
|----|-----------|-------|
@@ -80,15 +80,15 @@ All events are surfaced to the user. The device marks some events `Visible="0"`
| 101 | RERA | Respiratory Effort Related Arousal |
| 102 | RERA | RERA variant, Strength 0-9 |
| 103 | FlowLimitation | Intermediate severity variant, Strength 34-87 |
| 106 | Hypopnea | Alternative detection criteria, possibly FOT-based |
| 106 | Hypopnea* | Alternative detection criteria, possibly FOT-based |
| 108 | FlowLimitation | Strong respiratory effort variant, Strength 11-97 (never zero) |
| 111 | Hypopnea | Obstructive, often paired with 1008 pressure response |
| 111 | Hypopnea* | Obstructive, often paired with 1008 pressure response |
| 112 | Snore | Strength 0-100 |
| 113 | Snore | Snore variant, Strength 40-45 |
| 121 | FlowLimitation | Always paired with 1129 (skip 1129) |
| 131 | FlowLimitation | Mild, Strength varies |
| 141 | LargeLeak | Extended leak episode, paired with 1102 |
| 151 | Hypopnea | Central/mixed variant |
| 151 | Hypopnea* | Central/mixed variant, primary hypopnea source |
| 161 | RERA | |
| 171 | SessionStart | |
| 181 | PressureOptimisation | Auto-titration cycle, Duration 1163-1915cs, Pressure field active (11.5-16.0 hPa) |
@@ -102,12 +102,13 @@ All events are surfaced to the user. The device marks some events `Visible="0"`
| 308 | TherapyPause | Mid-session pause marker, Duration=0, paired with 309 |
| 309 | TherapyPause | Pause duration, follows 308, Duration=20-300cs |
| 330 | (skip) | Unknown, Duration always 100 |
| 1007 | CentralApnea | FOT-detected central variant |
| 1007 | CentralApnea | FOT-detected central, separate detection pathway from ID 2 (non-overlapping timestamps) |
| 1008 | PressureIncrease | Pressure=100 means 10.0 hPa response |
| 1101 | (skip) | FOT obstructive detection signal |
| 1101 | (skip) | FOT obstructive detection signal — secondary firmware scoring layer, unique timestamps |
| 1102 | (skip) | FOT pair signal for 141 (large leak) |
| 1111 | (skip) | FOT central detection signal |
| 1111 | (skip) | FOT central detection signal — secondary firmware scoring layer, unique timestamps |
| 1112 | MixedApnea | FOT mixed signal |
| 1118 | (skip) | FOT signal, unclassified — secondary firmware scoring layer |
| 1126 | LargeLeak | |
| 1129 | (skip) | FlowLimitation pair signal |
| 1230 | SessionStart | Init sentinel, EndTime always 1cs, once per session |
@@ -119,6 +120,8 @@ All events are surfaced to the user. The device marks some events `Visible="0"`
| 1237 | SessionStart | Init handshake completion, EndTime always 11cs |
| 1238 | SessionStart | Session initialisation |
\* **AASM reclassification**: IDs 106, 111, 151 map to Hypopnea but events with Duration < 1000cs (10s) are reclassified as FlowLimitation at parse time per AASM 2012 minimum duration criteria. ID 151 is the primary hypopnea source — in session 003344, 8 of 115 events (7%) met the 10s threshold.
**DeviceEvent ParameterID mapping (at Time="0"):**
| ParameterID | Meaning | Scaling |
@@ -156,7 +159,7 @@ This context is documented here not as colour but because it directly shapes des
**Time units:** Device-native centiseconds (cs) are preserved in event parsing. Convert to seconds for display: `cs as f32 / 100.0`
**AHI calculation:** Obstructive + Central + Mixed + Hypopnea events / session duration in hours. Hypopneas are included per AASM 2012 guidelines.
**AHI calculation:** Obstructive + Central + Mixed + Hypopnea events / session duration in hours. Hypopneas are included per AASM 2012 guidelines. Only hypopneas with Duration ≥ 1000cs (10 seconds) are counted — sub-threshold events from IDs 106/111/151 are reclassified as FlowLimitation at parse time rather than dropped, preserving all data while producing clinically accurate AHI.
**Event source:** Always tag events as `EventSource::DeviceReported` when parsed from device data. `EventSource::TidalDerived` is reserved for events computed by tidal-core analysis algorithms layered on top.
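The AHI rule above is mechanical enough to sketch. A minimal illustrative reduction; `Event` and `EventType` here are simplified stand-ins, not the tidal-core definitions:

```rust
// Illustrative AHI computation per the rule above (AASM 2012).
// With parse-time reclassification, no sub-10s Hypopnea survives to this
// point, so counting by type alone is sufficient.
enum EventType { ObstructiveApnea, CentralApnea, MixedApnea, Hypopnea, FlowLimitation }

struct Event { event_type: EventType }

fn ahi(events: &[Event], therapy_secs: u32) -> f32 {
    let counted = events.iter().filter(|e| matches!(
        e.event_type,
        EventType::ObstructiveApnea | EventType::CentralApnea
            | EventType::MixedApnea | EventType::Hypopnea
    )).count();
    counted as f32 / (therapy_secs as f32 / 3600.0)
}

fn main() {
    let events = vec![
        Event { event_type: EventType::ObstructiveApnea },
        Event { event_type: EventType::Hypopnea },
        Event { event_type: EventType::FlowLimitation }, // reclassified sub-10s, not counted
    ];
    println!("AHI = {:.1}", ahi(&events, 7200)); // 2 counted events over 2 h -> 1.0
}
```

The real implementation draws these counts from `analysis::summarise_period`; this only demonstrates the arithmetic.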


@@ -51,7 +51,8 @@ pub fn render_text(ctx: &ExportContext) -> String {
}
}
writeln!(out, "AHI per AASM 2012: (Obstructive + Central + Mixed + Hypopnea) / therapy hours").unwrap();
writeln!(out, "AHI per AASM 2012: (Obstructive + Central + Mixed + Hypopnea ≥10s) / therapy hours").unwrap();
writeln!(out, "AHI is computed over recorded therapy time only, excluding mask-off gaps between sessions").unwrap();
writeln!(out, "Generated by Tidal · not a medical device").unwrap();
out
}
@@ -74,9 +75,13 @@ fn write_period_summary_text(out: &mut String, ns: &PeriodSummary) {
}
fn write_event_timeline_text(out: &mut String, session: &Session) {
writeln!(out, " Events for session {} ({})", session.id.0, session.started_at.format("%H:%M")).unwrap();
let clinical: Vec<_> = session.events.iter()
.filter(|e| e.event_type.is_clinical())
.collect();
writeln!(out, " Events for session {} ({}) — {} clinical events",
session.id.0, session.started_at.format("%H:%M"), clinical.len()).unwrap();
writeln!(out, " {:>8} {:>6} {:<20} {:>5} {:>8}", "offset", "dur", "type", "str", "hPa").unwrap();
for event in &session.events {
for event in &clinical {
let offset_s = event.end_offset_cs as f32 / 100.0;
let dur_s = event.duration_cs as f32 / 100.0;
let strength = event.strength.map(|s| format!("{}", s)).unwrap_or_default();
@@ -134,17 +139,21 @@ pub fn render_markdown(ctx: &ExportContext) -> String {
writeln!(out, "---").unwrap();
writeln!(out).unwrap();
writeln!(out, "*AHI per AASM 2012: (Obstructive + Central + Mixed + Hypopnea) / therapy hours*").unwrap();
writeln!(out, "*AHI per AASM 2012: (Obstructive + Central + Mixed + Hypopnea ≥10s) / therapy hours*").unwrap();
writeln!(out, "*AHI is computed over recorded therapy time only, excluding mask-off gaps between sessions*").unwrap();
writeln!(out, "*Generated by Tidal · not a medical device*").unwrap();
out
}
fn write_event_timeline_md(out: &mut String, session: &Session) {
writeln!(out, "### Events: {} ({})", session.id.0, session.started_at.format("%H:%M")).unwrap();
let clinical: Vec<_> = session.events.iter()
.filter(|e| e.event_type.is_clinical())
.collect();
writeln!(out, "### Events: {} ({}) — {} clinical events", session.id.0, session.started_at.format("%H:%M"), clinical.len()).unwrap();
writeln!(out).unwrap();
writeln!(out, "| Offset | Duration | Type | Strength | Pressure |").unwrap();
writeln!(out, "|-------:|---------:|------|----------|--------:|").unwrap();
for event in &session.events {
for event in &clinical {
let offset_s = event.end_offset_cs as f32 / 100.0;
let dur_s = event.duration_cs as f32 / 100.0;
let strength = event.strength.map(|s| format!("{}", s)).unwrap_or_default();
@@ -208,6 +217,19 @@ struct JsonSession {
duration_secs: u32,
summary: JsonSummary,
events: Vec<JsonEvent>,
#[serde(skip_serializing_if = "Option::is_none")]
signals: Option<Vec<JsonSignalSummary>>,
}
#[derive(Serialize)]
struct JsonSignalSummary {
label: String,
unit: String,
sample_rate_hz: f32,
sample_count: usize,
min: f32,
max: f32,
mean: f32,
}
#[derive(Serialize)]
@@ -234,7 +256,7 @@ pub fn render_json(ctx: &ExportContext) -> String {
from: ctx.from.map(|d| d.to_rfc3339()),
to: ctx.to.map(|d| d.to_rfc3339()),
},
methodology: "AASM 2012: (Obstructive + Central + Mixed + Hypopnea) / therapy hours".into(),
methodology: "AASM 2012: (Obstructive + Central + Mixed + Hypopnea ≥10s) / therapy hours. Therapy time only, excludes mask-off gaps.".into(),
periods: ctx.periods.iter().map(|group| {
let ns = analysis::summarise_period(group);
JsonPeriod {
@@ -255,7 +277,9 @@ pub fn render_json(ctx: &ExportContext) -> String {
mixed_count: s.mixed_count,
hypopnea_count: s.hypopnea_count,
},
events: session.events.iter().map(|e| JsonEvent {
events: session.events.iter()
.filter(|e| e.event_type.is_clinical())
.map(|e| JsonEvent {
end_offset_cs: e.end_offset_cs,
duration_cs: e.duration_cs,
event_type: e.event_type.to_string(),
@@ -263,6 +287,25 @@ pub fn render_json(ctx: &ExportContext) -> String {
pressure_hpa: e.pressure_hpa,
source: e.source.to_string(),
}).collect(),
signals: session.signals.as_ref().map(|block| {
block.channels.iter().map(|ch| {
let (min, max, mean) = if ch.samples.is_empty() {
(0.0, 0.0, 0.0)
} else {
let min = ch.samples.iter().cloned().fold(f32::INFINITY, f32::min);
let max = ch.samples.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
let mean = ch.samples.iter().sum::<f32>() / ch.samples.len() as f32;
(min, max, mean)
};
JsonSignalSummary {
label: ch.label.to_string(),
unit: ch.unit.clone(),
sample_rate_hz: ch.sample_rate_hz,
sample_count: ch.samples.len(),
min, max, mean,
}
}).collect()
}),
}
}).collect(),
}
@@ -294,7 +337,7 @@ pub fn render_csv(ctx: &ExportContext) -> String {
for group in ctx.periods {
for session in &group.sessions {
for event in &session.events {
for event in session.events.iter().filter(|e| e.event_type.is_clinical()) {
let offset_s = event.end_offset_cs as f32 / 100.0;
let dur_s = event.duration_cs as f32 / 100.0;
let strength = event.strength.map(|s| s.to_string()).unwrap_or_default();


@@ -43,6 +43,9 @@ enum Command {
/// Show individual events
#[arg(long)]
events: bool,
/// Show signal channel summaries
#[arg(long)]
signals: bool,
},
/// Export therapy data grouped by therapy period
Export {
@@ -61,6 +64,9 @@ enum Command {
/// Include individual events in text/markdown output
#[arg(long)]
events: bool,
/// Include signal channel summaries in output
#[arg(long)]
signals: bool,
},
}
@@ -213,8 +219,8 @@ fn main() -> Result<()> {
}
Ok(())
}
Command::Session { id, events: show_events } => {
let session = store.get_session(&default_user_id, &id, false)
Command::Session { id, events: show_events, signals: show_signals } => {
let session = store.get_session(&default_user_id, &id, show_signals)
.map_err(|e| anyhow::anyhow!("{}", e))?;
let session = match session {
@@ -265,21 +271,25 @@ fn main() -> Result<()> {
println!(" Hypopnea: {}", summary.hypopnea_count);
println!();
// Event counts by type
// Event counts by type (clinical only in summary)
use std::collections::BTreeMap;
let clinical_events: Vec<_> = session.events.iter()
.filter(|e| e.event_type.is_clinical())
.collect();
let mut counts: BTreeMap<String, usize> = BTreeMap::new();
for event in &session.events {
for event in &clinical_events {
*counts.entry(event.event_type.to_string()).or_default() += 1;
}
println!("Events {} total", session.events.len());
println!("Events {} clinical ({} total incl. device lifecycle)",
clinical_events.len(), session.events.len());
for (et, count) in &counts {
println!(" {:<20} {}", et, count);
}
// Event timeline
// Event timeline (--events shows all events including device lifecycle)
if show_events {
println!();
println!("Timeline");
println!("Timeline ({} events)", session.events.len());
println!(" {:>8} {:>6} {:<20} {:>5} {:>8}", "offset", "dur", "type", "str", "hPa");
println!(" {:>8} {:>6} {:<20} {:>5} {:>8}", "------", "---", "----", "---", "---");
for event in &session.events {
@@ -298,9 +308,34 @@ fn main() -> Result<()> {
}
}
// Signal summary
if show_signals {
if let Some(ref block) = session.signals {
println!();
println!("Signals {} channels", block.channels.len());
println!(" {:<20} {:<8} {:>8} {:>10} {:>10} {:>10} {:>10}",
"Label", "Unit", "Samples", "Rate(Hz)", "Min", "Max", "Mean");
for ch in &block.channels {
let (min, max, mean) = if ch.samples.is_empty() {
(0.0, 0.0, 0.0)
} else {
let min = ch.samples.iter().cloned().fold(f32::INFINITY, f32::min);
let max = ch.samples.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
let mean = ch.samples.iter().sum::<f32>() / ch.samples.len() as f32;
(min, max, mean)
};
println!(" {:<20} {:<8} {:>8} {:>10.3} {:>10.1} {:>10.1} {:>10.1}",
ch.label, ch.unit, ch.samples.len(), ch.sample_rate_hz, min, max, mean);
}
} else {
println!();
println!("No signal data (use 'tidal import' to reimport with signal decoding)");
}
}
Ok(())
}
Command::Export { from, to, user_id: explicit_user_id, format, events: show_events } => {
Command::Export { from, to, user_id: explicit_user_id, format, events: show_events, signals: show_signals } => {
let export_user_id = match explicit_user_id {
Some(ref id) => {
store.ensure_user(id, None)
@@ -326,7 +361,7 @@ fn main() -> Result<()> {
// Load full sessions
let mut sessions = Vec::new();
for row in &filtered_rows {
if let Some(session) = store.get_session(&export_user_id, &row.id, false)
if let Some(session) = store.get_session(&export_user_id, &row.id, show_signals)
.map_err(|e| anyhow::anyhow!("{}", e))? {
sessions.push(session);
}


@@ -107,6 +107,21 @@ pub enum EventType {
Unknown(u32),
}
impl EventType {
/// Whether this event is clinically relevant (respiratory events,
/// device responses to breathing). Returns false for session lifecycle,
/// device housekeeping, and unknown firmware signals.
pub fn is_clinical(&self) -> bool {
matches!(self,
Self::ObstructiveApnea | Self::CentralApnea | Self::MixedApnea |
Self::Hypopnea | Self::FlowLimitation | Self::Snore | Self::RERA |
Self::PressureIncrease | Self::PressureOptimisation |
Self::LargeLeak | Self::MaskOff | Self::PressureChange |
Self::TherapyPause
)
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum EventSource {
DeviceReported, // as classified by device firmware
@@ -117,7 +132,6 @@ pub enum EventSource {
/// All signals share a common time base from session start.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SignalBlock {
pub sample_rate_hz: f32,
pub channels: Vec<SignalChannel>,
}
@@ -125,6 +139,7 @@ pub struct SignalBlock {
pub struct SignalChannel {
pub label: SignalLabel,
pub unit: String,
pub sample_rate_hz: f32,
pub physical_min: f32,
pub physical_max: f32,
pub samples: Vec<f32>, // scaled to physical units
@@ -143,9 +158,14 @@ pub enum SignalLabel {
HeartRate,
ObstructLevel,
EPAPTarget,
EEPAPTarget,
IPAPTarget,
InspExpirRatio,
MVFluctuation,
AMV,
SPRStatus,
TotalLeakage,
RSBI,
Unknown(String),
}
@@ -293,9 +313,14 @@ impl fmt::Display for SignalLabel {
Self::HeartRate => f.write_str("HeartRate"),
Self::ObstructLevel => f.write_str("ObstructLevel"),
Self::EPAPTarget => f.write_str("EPAPTarget"),
Self::EEPAPTarget => f.write_str("EEPAPTarget"),
Self::IPAPTarget => f.write_str("IPAPTarget"),
Self::InspExpirRatio => f.write_str("InspExpirRatio"),
Self::MVFluctuation => f.write_str("MVFluctuation"),
Self::AMV => f.write_str("AMV"),
Self::SPRStatus => f.write_str("SPRStatus"),
Self::TotalLeakage => f.write_str("TotalLeakage"),
Self::RSBI => f.write_str("RSBI"),
Self::Unknown(s) => write!(f, "Unknown({})", s),
}
}
@@ -316,9 +341,14 @@ impl FromStr for SignalLabel {
"HeartRate" => Self::HeartRate,
"ObstructLevel" => Self::ObstructLevel,
"EPAPTarget" => Self::EPAPTarget,
"EEPAPTarget" => Self::EEPAPTarget,
"IPAPTarget" => Self::IPAPTarget,
"InspExpirRatio" => Self::InspExpirRatio,
"MVFluctuation" => Self::MVFluctuation,
"AMV" => Self::AMV,
"SPRStatus" => Self::SPRStatus,
"TotalLeakage" => Self::TotalLeakage,
"RSBI" => Self::RSBI,
other => Self::Unknown(
other.strip_prefix("Unknown(").and_then(|r| r.strip_suffix(')'))
.unwrap_or(other).to_owned()


@@ -5,44 +5,8 @@
//! Pressure values use x100 scaling (1600 = 16.0 hPa).
use anyhow::Result;
use quick_xml::de::from_str;
use serde::Deserialize;
use tidal_core::entities::{Event, EventType, EventSource, TherapyMode, TherapySettings};
#[derive(Debug, Deserialize)]
struct EventFile {
#[serde(rename = "DeviceEvent", default)]
device_events: Vec<RawDeviceEvent>,
#[serde(rename = "RespEvent", default)]
resp_events: Vec<RawRespEvent>,
}
#[derive(Debug, Deserialize)]
struct RawDeviceEvent {
#[serde(rename = "DeviceEventID")]
event_id: u32,
#[serde(rename = "Time")]
time: u32,
#[serde(rename = "ParameterID")]
parameter_id: u32,
#[serde(rename = "NewValue")]
new_value: i64,
}
#[derive(Debug, Deserialize)]
struct RawRespEvent {
#[serde(rename = "RespEventID")]
event_id: u32,
#[serde(rename = "EndTime")]
end_time: u32,
#[serde(rename = "Duration")]
duration: u32,
#[serde(rename = "Pressure")]
pressure: u32,
#[serde(rename = "Strength")]
strength: u8,
}
/// Mapped RespEventID values confirmed from reverse engineering of
/// prisma SMART (WM 100 TD) SD card event XML across 53 sessions.
///
@@ -128,13 +92,6 @@ pub struct ParsedEvents {
}
pub fn parse(xml: &str) -> Result<ParsedEvents> {
// quick-xml doesn't handle mixed element types in sequence well
// so we parse the raw XML manually for the desc root
let wrapped = format!("<root>{}</root>",
xml.trim_start_matches("<?xml version=\"1.0\" encoding=\"utf-8\"?>")
.trim_start_matches("<desc>")
.trim_end_matches("</desc>"));
let settings = extract_settings(xml);
let mut events = Vec::new();
@@ -160,7 +117,14 @@ fn parse_resp_event_line(line: &str) -> Option<Event> {
let pressure = attr_u32(line, "Pressure")?;
let strength = attr_u8(line, "Strength")?;
let event_type = map_resp_event_id(id)?; // None means skip
let mut event_type = map_resp_event_id(id)?; // None means skip
// AASM 2012: hypopnea requires ≥ 10 seconds (1000 centiseconds).
// Sub-threshold events from IDs 106/111/151 are reclassified as
// flow limitation rather than dropped — the user owns all their data.
if event_type == EventType::Hypopnea && duration < 1000 {
event_type = EventType::FlowLimitation;
}
// Pressure uses x100 scaling: 100 = 1.0 hPa
let pressure_hpa = if pressure > 0 {


@@ -1,64 +1,191 @@
//! Parser for Löwenstein .wmedf signal files (EDF variant).
//!
//! Header confirmed as standard EDF ASCII (256 bytes base) with
//! 18 signal channels: Pressure, EEPAPsoll, IPAPsoll, EPAPsoll,
//! RespFlow, rAMV, BreathVolume, BreathFrequency, LeakFlowBreath,
//! ObstructLevel, SpO2, HeartFrequency, SPRstatus, InspExpirRel,
//! MV, rMVFluctuation (+ 2 additional).
//!
//! Session start confirmed at byte offset 0x68: "Wed, 25.03.2026 23:32:42"
//! Standard EDF ASCII header with 18 signal channels. Data records are
//! 70 bytes (35 × i16 LE), each covering 18 seconds. The header's
//! num_data_records field is unreliable (always "1"); actual record
//! count is derived from file size.
use anyhow::Result;
use chrono::{DateTime, Utc, TimeZone, NaiveDateTime};
use anyhow::{Context, Result};
use chrono::{DateTime, NaiveDateTime, TimeZone, Utc};
use std::path::Path;
use tidal_core::entities::{SignalBlock, SignalChannel, SignalLabel};
const GLOBAL_HEADER: usize = 256;
const SIGNAL_HEADER: usize = 256;
pub struct WmedfFile {
pub started_at: DateTime<Utc>,
pub signals: SignalBlock,
}
pub fn parse(path: &Path) -> Result<WmedfFile> {
let data = std::fs::read(path)?;
// EDF header: fixed 256 byte global header + 256 bytes per signal
let header = std::str::from_utf8(&data[..256.min(data.len())])?;
let started_at = parse_start_time(header)?;
let num_signals = parse_field(header, 236, 4)
.and_then(|s| s.trim().parse::<usize>().ok())
.unwrap_or(18);
struct EdfSignalHeader {
label: String,
unit: String,
phys_min: f32,
phys_max: f32,
dig_min: f32,
dig_max: f32,
samples_per_record: usize,
}
pub fn parse(path: &Path) -> Result<WmedfFile> {
let data = std::fs::read(path)
.with_context(|| format!("Failed to read {:?}", path))?;
let global = std::str::from_utf8(&data[..GLOBAL_HEADER.min(data.len())])
.context("Invalid UTF-8 in EDF global header")?;
let started_at = parse_start_time(global)?;
let header_bytes = parse_ascii_field(global, 184, 8)
.and_then(|s| s.trim().parse::<usize>().ok())
.unwrap_or(GLOBAL_HEADER);
let num_signals = (header_bytes - GLOBAL_HEADER) / SIGNAL_HEADER;
if data.len() < header_bytes {
anyhow::bail!("File too small for declared header size");
}
// EDF: record duration is the 8 ASCII chars at offset 244 (236 holds num_data_records)
let record_duration_s = parse_ascii_field(global, 244, 8)
.and_then(|s| s.trim().parse::<f32>().ok())
.unwrap_or(18.0);
let sig_headers = parse_signal_headers(&data, num_signals)?;
let samples_per_record: usize = sig_headers.iter().map(|h| h.samples_per_record).sum();
let record_bytes = samples_per_record * 2; // i16 = 2 bytes
let data_start = header_bytes;
let num_records = (data.len() - data_start) / record_bytes;
let mut channel_samples: Vec<Vec<f32>> = sig_headers.iter()
.map(|h| Vec::with_capacity(h.samples_per_record * num_records))
.collect();
// Parse data records
for record_idx in 0..num_records {
let record_offset = data_start + record_idx * record_bytes;
let mut sample_offset = record_offset;
for (ch_idx, header) in sig_headers.iter().enumerate() {
let scale = if (header.dig_max - header.dig_min).abs() > 0.0 {
(header.phys_max - header.phys_min) / (header.dig_max - header.dig_min)
} else {
1.0
};
for _ in 0..header.samples_per_record {
if sample_offset + 2 > data.len() { break; }
let raw = i16::from_le_bytes([data[sample_offset], data[sample_offset + 1]]);
// Clamp to declared digital range — channels with 8-bit ranges
// (dig_min=-128, dig_max=127) store values in 16-bit words and
// out-of-range values are device artefacts.
let clamped = (raw as f32).clamp(header.dig_min, header.dig_max);
let physical = header.phys_min + (clamped - header.dig_min) * scale;
channel_samples[ch_idx].push(physical);
sample_offset += 2;
}
}
}
// Build signal channels
let channels: Vec<SignalChannel> = sig_headers.iter().zip(channel_samples.into_iter())
.map(|(header, samples)| {
let sample_rate_hz = header.samples_per_record as f32 / record_duration_s;
SignalChannel {
label: edf_label_to_signal_label(&header.label),
unit: header.unit.clone(),
sample_rate_hz,
physical_min: header.phys_min,
physical_max: header.phys_max,
samples,
}
})
.collect();
// TODO: parse signal headers and data records
// Signal headers start at byte 256, each 256 bytes
// Data records follow at 256 + (num_signals * 256)
Ok(WmedfFile {
started_at,
signals: SignalBlock {
sample_rate_hz: 1.0, // placeholder
channels: vec![], // TODO: parse signal data
},
signals: SignalBlock { channels },
})
}
fn parse_signal_headers(data: &[u8], num_signals: usize) -> Result<Vec<EdfSignalHeader>> {
// EDF interleaved layout: all labels, then all transducers, then all units, etc.
let header_str = std::str::from_utf8(&data[..GLOBAL_HEADER + num_signals * SIGNAL_HEADER])
.context("Invalid UTF-8 in EDF signal headers")?;
let mut headers = Vec::with_capacity(num_signals);
// Field offsets and sizes per the EDF specification
let label_offset = GLOBAL_HEADER; // 16 bytes each
let unit_offset = label_offset + num_signals * 16 + num_signals * 80; // after transducers
let phys_min_offset = unit_offset + num_signals * 8;
let phys_max_offset = phys_min_offset + num_signals * 8;
let dig_min_offset = phys_max_offset + num_signals * 8;
let dig_max_offset = dig_min_offset + num_signals * 8;
let spr_offset = dig_max_offset + num_signals * 8 + num_signals * 80; // after prefiltering
for i in 0..num_signals {
let label = header_str[label_offset + i * 16..label_offset + (i + 1) * 16].trim().to_owned();
let unit = header_str[unit_offset + i * 8..unit_offset + (i + 1) * 8].trim().to_owned();
let phys_min = header_str[phys_min_offset + i * 8..phys_min_offset + (i + 1) * 8]
.trim().parse::<f32>().unwrap_or(0.0);
let phys_max = header_str[phys_max_offset + i * 8..phys_max_offset + (i + 1) * 8]
.trim().parse::<f32>().unwrap_or(0.0);
let dig_min = header_str[dig_min_offset + i * 8..dig_min_offset + (i + 1) * 8]
.trim().parse::<f32>().unwrap_or(0.0);
let dig_max = header_str[dig_max_offset + i * 8..dig_max_offset + (i + 1) * 8]
.trim().parse::<f32>().unwrap_or(0.0);
let samples_per_record = header_str[spr_offset + i * 8..spr_offset + (i + 1) * 8]
.trim().parse::<usize>().unwrap_or(1);
headers.push(EdfSignalHeader {
label,
unit,
phys_min,
phys_max,
dig_min,
dig_max,
samples_per_record,
});
}
Ok(headers)
}
fn edf_label_to_signal_label(label: &str) -> SignalLabel {
match label {
"Pressure" => SignalLabel::Pressure,
"EEPAPsoll" => SignalLabel::EEPAPTarget,
"IPAPsoll" => SignalLabel::IPAPTarget,
"EPAPsoll" => SignalLabel::EPAPTarget,
"RespFlow" => SignalLabel::RespiratoryFlow,
"rAMV" => SignalLabel::AMV,
"BreathVolume" => SignalLabel::BreathVolume,
"BreathFrequency" => SignalLabel::BreathFrequency,
"LeakFlowBreath" => SignalLabel::LeakFlow,
"ObstructLevel" => SignalLabel::ObstructLevel,
"SpO2" => SignalLabel::SpO2,
"HeartFrequency" => SignalLabel::HeartRate,
"SPRstatus" => SignalLabel::SPRStatus,
"InspExpirRel" => SignalLabel::InspExpirRatio,
"MV" => SignalLabel::MinuteVolume,
"rMVFluctuation" => SignalLabel::MVFluctuation,
"TotalLeakage" => SignalLabel::TotalLeakage,
"RSBI" => SignalLabel::RSBI,
other => SignalLabel::Unknown(other.to_owned()),
}
}
fn parse_start_time(header: &str) -> Result<DateTime<Utc>> {
// EDF standard: date at offset 168 (8 chars "dd.MM.yy")
// time at offset 176 (8 chars "HH.mm.ss")
// Löwenstein uses '.' as separator, confirmed from hex dump
let date_str = header.get(168..176).unwrap_or("").trim();
let time_str = header.get(176..184).unwrap_or("").trim();
// Also present in human-readable form at offset 0x58:
// "Recording start at Wed, 25.03.2026 23:32:42"
// Human-readable form at offset ~0x58: "Recording start at Wed, 25.03.2026 23:32:42"
if let Some(pos) = header.find("Recording start at ") {
let rest = &header[pos + 19..];
// Parse "Wed, 25.03.2026 23:32:42"
if let Some(dt_str) = rest.split_whitespace()
.skip(1) // skip "Wed,"
.collect::<Vec<_>>()
.first()
.and_then(|date| rest.find(date).map(|i| &rest[i..i+19]))
.and_then(|date| rest.find(date).map(|i| &rest[i..i + 19]))
{
if let Ok(ndt) = NaiveDateTime::parse_from_str(dt_str, "%d.%m.%Y %H:%M:%S") {
return Ok(Utc.from_utc_datetime(&ndt));
@@ -69,6 +196,6 @@ fn parse_start_time(header: &str) -> Result<DateTime<Utc>> {
anyhow::bail!("Could not parse session start time from wmedf header")
}
fn parse_field(header: &str, offset: usize, len: usize) -> Option<&str> {
fn parse_ascii_field(header: &str, offset: usize, len: usize) -> Option<&str> {
header.get(offset..offset + len)
}


@@ -106,7 +106,7 @@ impl SqliteStore {
sig_stmt.execute(params![
session.id.0,
user_id,
block.sample_rate_hz,
ch.sample_rate_hz,
ch.label.to_string(),
ch.unit,
ch.physical_min,
@@ -289,7 +289,7 @@ impl SessionStore for SqliteStore {
WHERE session_id = ?1 AND user_id = ?2",
)?;
let channels: Vec<(f32, SignalChannel)> = sig_stmt.query_map(
let channels: Vec<SignalChannel> = sig_stmt.query_map(
params![session_id, user_id],
|row| {
Ok((
@@ -304,23 +304,20 @@ impl SessionStore for SqliteStore {
)?.map(|r| {
let (rate, label_str, unit, pmin, pmax, blob) = r?;
let label: SignalLabel = label_str.parse().unwrap();
Ok((rate, SignalChannel {
Ok(SignalChannel {
label,
unit,
sample_rate_hz: rate,
physical_min: pmin,
physical_max: pmax,
samples: blob_to_samples(&blob),
}))
})
}).collect::<Result<Vec<_>>>()?;
if channels.is_empty() {
None
} else {
let sample_rate_hz = channels[0].0;
Some(SignalBlock {
sample_rate_hz,
channels: channels.into_iter().map(|(_, ch)| ch).collect(),
})
Some(SignalBlock { channels })
}
} else {
None
@@ -585,10 +582,10 @@ mod tests {
let mut session = test_session("300306-003336");
session.signals = Some(SignalBlock {
sample_rate_hz: 25.0,
channels: vec![SignalChannel {
label: SignalLabel::Pressure,
unit: "hPa".into(),
sample_rate_hz: 25.0,
physical_min: 0.0,
physical_max: 25.0,
samples: vec![4.0, 4.5, 5.0, 5.5, 6.0],
@@ -599,8 +596,8 @@ mod tests {
let loaded = store.get_session("user-a", "300306-003336", true).unwrap().unwrap();
let signals = loaded.signals.unwrap();
assert_eq!(signals.sample_rate_hz, 25.0);
assert_eq!(signals.channels.len(), 1);
assert_eq!(signals.channels[0].sample_rate_hz, 25.0);
assert_eq!(signals.channels[0].samples, vec![4.0, 4.5, 5.0, 5.5, 6.0]);
assert_eq!(signals.channels[0].label.to_string(), "Pressure");
}
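`blob_to_samples` is referenced in the read path but its definition is outside this diff. A plausible round-trip sketch, assuming samples are packed as little-endian f32 bytes (the actual tidal-store encoding is not shown, so treat the layout as an assumption):

```rust
// Hypothetical blob encoding: f32 samples packed as little-endian bytes.
// The real tidal-store encoding may differ; this only illustrates the idea.
fn samples_to_blob(samples: &[f32]) -> Vec<u8> {
    samples.iter().flat_map(|s| s.to_le_bytes()).collect()
}

fn blob_to_samples(blob: &[u8]) -> Vec<f32> {
    blob.chunks_exact(4)
        .map(|c| f32::from_le_bytes([c[0], c[1], c[2], c[3]]))
        .collect()
}

fn main() {
    let samples = vec![4.0f32, 4.5, 5.0, 5.5, 6.0];
    let blob = samples_to_blob(&samples);
    assert_eq!(blob_to_samples(&blob), samples);
    println!("round trip ok: {} bytes for {} samples", blob.len(), samples.len());
}
```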

doc/plan/signals.md Normal file

@@ -0,0 +1,68 @@
# Plan: Decode wmedf signal data, expose via CLI and JSON export
## Context
The .wmedf parser currently extracts only the session start timestamp. The binary signal data (18 channels of continuous waveforms — pressure, respiratory flow, SpO2, heart rate, etc.) is ignored. This data is clinically critical: it enables visual confirmation of apnea types, desaturation tracking, and cardiac correlation.
## Binary format summary
- Standard EDF, 18 channels, 4864-byte header (256 global + 18×256 signal)
- Signal headers use EDF interleaved layout (all labels, then all transducers, etc.)
- Data records: 70 bytes each (35 × i16 LE), 18 seconds per record
- Samples per record varies by channel: Pressure(5), RespFlow(10), SPRstatus(5), all others(1) = 35 total
- Num records: derive from `(file_size - 4864) / 70` (header field unreliable)
- Conversion: `physical = phys_min + (raw - dig_min) * (phys_max - phys_min) / (dig_max - dig_min)`
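The record layout and conversion formula above can be exercised in isolation. A sketch under the stated assumptions; names like `Chan` and `decode_record` are illustrative, not the crate's API:

```rust
// Decode one EDF data record: consecutive i16 LE samples, grouped per
// channel by samples_per_record, then scaled digital -> physical.
struct Chan { phys_min: f32, phys_max: f32, dig_min: f32, dig_max: f32, samples_per_record: usize }

fn decode_record(record: &[u8], chans: &[Chan]) -> Vec<Vec<f32>> {
    let mut out = Vec::with_capacity(chans.len());
    let mut off = 0;
    for ch in chans {
        let scale = (ch.phys_max - ch.phys_min) / (ch.dig_max - ch.dig_min);
        let mut samples = Vec::with_capacity(ch.samples_per_record);
        for _ in 0..ch.samples_per_record {
            let raw = i16::from_le_bytes([record[off], record[off + 1]]) as f32;
            // Clamp to the declared digital range before scaling
            // (out-of-range values are device artefacts).
            let clamped = raw.clamp(ch.dig_min, ch.dig_max);
            samples.push(ch.phys_min + (clamped - ch.dig_min) * scale);
            off += 2;
        }
        out.push(samples);
    }
    out
}

fn main() {
    // One channel, 0..25 hPa over digital 0..32767, two samples per record.
    let ch = Chan { phys_min: 0.0, phys_max: 25.0, dig_min: 0.0, dig_max: 32767.0, samples_per_record: 2 };
    let record = [0u8, 0, 0xFF, 0x7F]; // raw 0 and 32767
    let decoded = decode_record(&record, &[ch]);
    println!("{:?}", decoded[0]); // approximately [0.0, 25.0]
}
```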
## Files to modify
- `crates/tidal-core/src/entities.rs` — move `sample_rate_hz` to `SignalChannel`, add new `SignalLabel` variants
- `crates/tidal-devices/src/lowenstein/wmedf.rs` — full signal header + data record decoding
- `crates/tidal-store/src/sqlite.rs` — use per-channel sample rate in read/write paths
- `crates/tidal-cli/src/main.rs` — add `--signals` flag to `session` command
- `crates/tidal-cli/src/export.rs` — add signal summaries to JSON export
## Step 1: Update entities
Move `sample_rate_hz` from `SignalBlock` to `SignalChannel`. Add `SignalLabel` variants: `EEPAPTarget`, `TotalLeakage`, `RSBI`, `AMV`, `SPRStatus`. Update `Display`/`FromStr`.
## Step 2: Update tidal-store for per-channel sample rate
- Write path: use `ch.sample_rate_hz` instead of `block.sample_rate_hz`
- Read path: store rate into each `SignalChannel`, construct `SignalBlock { channels }` without block-level rate
- Update tests
## Step 3: Implement wmedf signal decoder
In `wmedf.rs`:
1. Parse signal headers from the interleaved EDF layout (labels at 256, units at 1984, phys min/max, dig min/max, samples_per_record at 4144)
2. Map EDF label strings to `SignalLabel` variants (e.g. "EEPAPsoll" → `EEPAPTarget`, "HeartFrequency" → `HeartRate`)
3. Parse data records: iterate 70-byte chunks, read i16 LE values, distribute across channels by samples_per_record
4. Convert digital → physical values per the EDF formula
5. Compute per-channel sample_rate_hz: `samples_per_record / 18.0` (record duration)
6. Build `SignalBlock` with populated `SignalChannel` entries
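Step 5's arithmetic reproduces the per-channel rates quoted in the commit message exactly (a throwaway check, not project code):

```rust
// Per-channel sample rate = samples_per_record / record_duration_s.
// With 18-second records: Pressure(5) -> ~0.278 Hz, RespFlow(10) -> ~0.556 Hz,
// single-sample channels -> ~0.056 Hz.
fn sample_rate_hz(samples_per_record: usize, record_duration_s: f32) -> f32 {
    samples_per_record as f32 / record_duration_s
}

fn main() {
    println!("Pressure: {:.3} Hz", sample_rate_hz(5, 18.0));  // 0.278
    println!("RespFlow: {:.3} Hz", sample_rate_hz(10, 18.0)); // 0.556
    println!("SpO2:     {:.3} Hz", sample_rate_hz(1, 18.0));  // 0.056
}
```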
## Step 4: Add `--signals` to CLI session command
When `--signals` is set, load session with signals and display summary table:
```
Signals 18 channels
Label Unit Samples Rate(Hz) Min Max Mean
Pressure hPa 3630 0.278 6.0 12.3 8.4
RespiratoryFlow l/min 7260 0.556 -42.0 55.3 1.2
SpO2 % 726 0.056 88.0 97.0 93.2
...
```
## Step 5: Add signal summaries to JSON export
Add `--signals` flag to export command. When set, load sessions with signals and include per-channel summary (label, unit, sample_rate_hz, sample_count, min, max, mean) in JSON output. Raw samples excluded from JSON — too large.
## Verification
1. `cargo build` — no errors or warnings
2. `cargo test` — all tests pass (including updated signal_blob_round_trip test)
3. `rm ~/.local/share/tidal/tidal.db && tidal import ~/lowenstein/therapy_extracted --from 2026-03-26 --to 2026-03-28` — imports with signals
4. `tidal session 300306-003344 --signals` — shows 18 channels with plausible values
5. `tidal export --format json --from 2026-03-26 --to 2026-03-28 --signals` — JSON includes signal summaries
6. Verify: Pressure values in hPa range (4-20), SpO2 in % range (80-100), HeartRate in bpm range (40-120)