A 24-Hour Network Traffic Audit of Lumina
Published: April 16, 2026 · By the Lumina engineering team
Every app says it "takes your privacy seriously." It's in every privacy policy, right between the part where they explain the 47 categories of data they collect and the part where they list the 200 "trusted partners" they share it with.
We wanted to do something different. Instead of asking you to trust our words, we decided to show you our wires. Every packet. Every connection. Every byte that left an iPhone running Atmos and Kindred for 24 hours.
The result: nothing. And that nothing is the whole point.
- Capture window: 24 hours of continuous capture, 8:00 AM Tuesday to 8:00 AM Wednesday.
- During the capture window, the tester actively used both apps.
- Scope: every outbound packet from the device during the 24-hour window. Total: 4,847 packets.
There are no Lumina servers. We don't operate backend infrastructure for user data. There is no api.luminaeco.app. There is no data.luminaeco.app. There is no analytics endpoint. There is no crash reporting service. There is no telemetry pipeline.
Zero packets were sent to any Lumina-owned domain or IP address, because there is nothing to send them to.
Every outbound connection fell into one of these categories:
| Destination | Purpose | Count | Expected? |
|---|---|---|---|
| *.apple.com (App Store / StoreKit) | Subscription validation via Apple's StoreKit 2 | 8 connections | Yes — Apple requires this for in-app purchases |
| *.apple.com (HealthKit sync) | Apple Health reading the local HealthKit store | Local only (0 outbound) | Yes — HealthKit reads are on-device |
| *.apple.com (APNs) | Apple Push Notification service (system-level) | 14 connections | Yes — iOS maintains this regardless of app |
| *.apple.com (NTP, OCSP, CDN) | System time sync, certificate checks, OS updates | 23 connections | Yes — standard iOS background activity |
| *.icloud.com | iCloud sync for other apps on the device | 41 connections | Yes — not from Lumina |
| mDNS / Bonjour | Local network discovery | 12 packets | Yes — standard iOS networking |
| DNS queries | Domain resolution for all of the above | ~200 queries | Yes — required for any network activity |
The HealthKit data reads — which are the core of Atmos's functionality — are entirely local. HealthKit does not make network requests to deliver data to an app. The data lives in the device's secure enclave-backed HealthKit store and is read in-process.
Partner sync uses a peer-to-peer protocol. During our test, the weather state was shared with a paired device. This traffic was:
- PrivacySafeState payload (weather state, readiness score, confidence — no biometrics)

This isn't a policy decision. It's a structural one. Here's why Lumina cannot leak your biometric data, even if we wanted to.
We don't operate user-data infrastructure. No AWS instances, no Firebase, no Supabase, no "we store it encrypted." There is nothing running that accepts biometric data. You can't exfiltrate data to a server that doesn't exist.
PrivacySafeState Design: Structural Prevention

The only type that can cross app boundaries is PrivacySafeState. This isn't a filtered view of a larger data model. It's a completely separate struct with a fixed set of fields:
- weatherState (String — e.g., "stormy")
- readinessScore (Double — 0-100 composite, not reversible to raw vitals)
- confidenceScore (Double — 0.0-1.0)
- confidenceLevel (String — "high" / "moderate" / "low")
- generatedAt (Date)
- innerCircleHashes (Array of SHA-256 hashes — never plain-text identifiers)
- recoveryActive (Bool)

That's seven fields. There is no heartRate field. No hrv field. No sleepSeconds field. Not because we filter them out. Because they were never added. The struct has no capacity to carry biometric data. You'd have to add new fields, recompile, and ship a new version through App Store review.
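As a sketch, a Swift declaration matching that field list could look like the following. The field names and types come straight from the list above; the Codable conformance and the sample values are our illustrative assumptions, not the shipped source.

```swift
import Foundation

// Hypothetical reconstruction of the seven-field PrivacySafeState described
// above. Field names mirror the article; everything else is an assumption.
struct PrivacySafeState: Codable {
    let weatherState: String        // e.g. "stormy"
    let readinessScore: Double      // 0-100 composite, not reversible to raw vitals
    let confidenceScore: Double     // 0.0-1.0
    let confidenceLevel: String     // "high" / "moderate" / "low"
    let generatedAt: Date
    let innerCircleHashes: [String] // SHA-256 hashes, never plain-text identifiers
    let recoveryActive: Bool
}

let state = PrivacySafeState(weatherState: "stormy", readinessScore: 42,
                             confidenceScore: 0.9, confidenceLevel: "high",
                             generatedAt: Date(), innerCircleHashes: [],
                             recoveryActive: false)

// Serializing the struct can only ever emit these seven keys; there is no
// field through which a heart rate or HRV value could travel.
let json = String(data: try! JSONEncoder().encode(state), encoding: .utf8)!
```

Because the type itself has no biometric fields, "leaking biometrics" would require changing the type, not just a bug in serialization code.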
When Atmos publishes its state for Kindred to read (via the shared app group), it must go through CrossAppBridge. The bridge accepts a WeatherForecast and outputs a PrivacySafeState. The raw biometric data that produced the forecast never touches the bridge. It stays in Atmos's process and is never written to the shared container.
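The bridge pattern can be sketched like this. WeatherForecast's fields, the publish signature, and the two-field stand-in for PrivacySafeState are all illustrative assumptions; the point is that the only output type has no slots for raw vitals.

```swift
import Foundation

// Hypothetical in-process type holding raw biometrics. It is deliberately
// NOT Codable in this sketch: it has no serialization path at all.
struct WeatherForecast {
    let weatherState: String
    let readinessScore: Double
    let heartRate: Double     // raw biometrics, never leave the process
    let hrv: Double
    let sleepSeconds: Double
}

// Truncated two-field stand-in for PrivacySafeState, for brevity.
struct PrivacySafeState: Codable {
    let weatherState: String
    let readinessScore: Double
}

enum CrossAppBridge {
    // The bridge accepts a forecast and can only emit PrivacySafeState bytes.
    static func publish(_ forecast: WeatherForecast) -> Data {
        let safe = PrivacySafeState(weatherState: forecast.weatherState,
                                    readinessScore: forecast.readinessScore)
        return try! JSONEncoder().encode(safe)
    }
}

let forecast = WeatherForecast(weatherState: "sunny", readinessScore: 88,
                               heartRate: 61, hrv: 74, sleepSeconds: 27000)
let shared = String(data: CrossAppBridge.publish(forecast), encoding: .utf8)!
```

The design choice here is that the funnel is enforced by the type system: the shared container only ever receives the output of `publish`, so the compiler, not a code review, guarantees raw fields stay behind.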
We don't use Firebase Analytics, Amplitude, Mixpanel, Sentry, Crashlytics, or any third-party SDK that phones home. We don't use Apple's own App Analytics beyond what the App Store provides by default (which doesn't include biometric data, because Apple doesn't share HealthKit data with developers' analytics).
We don't just audit manually. Our CI pipeline runs automated tests that verify privacy invariants on every build. If someone accidentally added a biometric field to PrivacySafeState, the build would fail before it ever reached a device.
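A minimal version of such an invariant check, using Swift runtime reflection, might look like the sketch below. The struct here is a stand-in built from the field list earlier in this post; the forbidden-name list matches the tests described next.

```swift
import Foundation

// Stand-in for PrivacySafeState, with the seven fields listed above.
struct PrivacySafeState {
    let weatherState = "calm"
    let readinessScore = 75.0
    let confidenceScore = 0.8
    let confidenceLevel = "high"
    let generatedAt = Date()
    let innerCircleHashes: [String] = []
    let recoveryActive = false
}

// Runtime reflection: collect the stored property names of the struct...
let labels = Mirror(reflecting: PrivacySafeState()).children.compactMap { $0.label }

// ...then check that no forbidden biometric field has crept in, and that
// the property count is still exactly seven (any addition trips the check).
let forbidden = ["heartRate", "hrv", "sleepSeconds", "deepSleep", "activityLoad"]
let leaked = labels.filter { forbidden.contains($0) }
```

In CI, `leaked.isEmpty` and `labels.count == 7` would be wrapped in test assertions so any new field fails the build.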
Here are the actual test names from our codebase:
PrivacySafeState Privacy Enforcement Tests (CrossAppBridgeTests.swift):
- noHeartRate — Verifies PrivacySafeState has no heart rate field via runtime reflection
- noHRV — Verifies no HRV or heartRateVariability field exists
- noSleep — Verifies no sleep, sleepSeconds, or deepSleep field exists
- noActivityLoad — Verifies no activityLoad or activity_load field exists
- jsonKeysAreSafe — Serializes PrivacySafeState to JSON and asserts only the 7 allowed keys are present
- exactPropertyCount — Asserts PrivacySafeState has exactly 7 stored properties (any addition breaks the test)

CrossAppBridge Tests (CrossAppBridgeTests.swift):
- roundTrip — Verifies the publish/read cycle preserves only safe fields
- clear — Verifies all shared state can be wiped
- allStatesRoundTrip — Verifies all WeatherState values survive the bridge without leaking extra data

Partner Sync Privacy Tests (PartnerSyncTests.swift):
- testNoRawBiometricsInSharedData — Encodes a state for sharing and asserts the JSON contains no heartRate, hrv, or sleepSeconds strings
- testPartnerWeatherFromPrivacySafeState — Verifies partner sync only transmits weather state and readiness, not raw data
- testExpiredShareCodeRejected — Verifies share codes are time-bounded (preventing long-lived data-sharing channels)

Ecosystem Intelligence Privacy Tests (EcosystemIntelligenceTests.swift):
- testPrivacyBoundaryUnderBurnout — The most thorough test. Seeds a burnout scenario (readiness: 12, HRV at stress levels), runs the full inference pipeline, publishes through the CrossAppBridge, reads it back, and asserts that raw sleep seconds (e.g., "14400") and forbidden field names (heartRate, hrv, sleepSeconds, deepSleep, activityLoad) do not appear in the serialized output. This test verifies that even under the worst-case biometric scenario, the privacy boundary holds.

PrivacyTest (PrivacyTest.kt):
- SHA-256 produces consistent hex output — Verifies contact hashing is deterministic
- adding contact stores hash not plain text — Verifies Inner Circle contacts are stored as SHA-256 hashes, never plain identifiers
- isInnerCircle returns true for added contact — Verifies lookup works through the hash layer
- isInnerCircle returns false for unknown contact — Verifies no false positives
- removeContact removes the hash — Verifies deletion actually removes the hash
- clearAll empties all contacts — Verifies full wipe works
- PrivacySafeState JSON contains no biometric fields — Serializes the Kotlin PrivacySafeState and asserts that forbidden strings (heartRate, hrv, sleepSeconds, deepSleep, activityLoad, strain) are absent
- PrivacySafeState has exactly 7 fields — Mirrors the Swift test, verifying field count on the Android side

Your heart rate data has never left your phone. Your HRV has never been transmitted anywhere. Your sleep data has never been uploaded to a server. Not because we promised not to look. Because there is nowhere to look.
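The Inner Circle pattern exercised by those Kotlin tests, storing a digest of each contact rather than the contact itself, can be sketched like this. The store API is an assumption, and the FNV-1a digest below is a dependency-free stand-in for illustration only; the actual implementation uses SHA-256, as the tests above state.

```swift
// Stand-in digest (FNV-1a). Illustrative only: the real app hashes
// contacts with SHA-256. The property that matters here is that the
// output is deterministic and not the plain-text identifier.
func digest(_ s: String) -> String {
    var hash: UInt64 = 0xcbf29ce484222325
    for byte in s.utf8 {
        hash ^= UInt64(byte)
        hash = hash &* 0x100000001b3
    }
    return String(hash, radix: 16)
}

// Hypothetical Inner Circle store: only digests are ever kept.
final class InnerCircleStore {
    private var hashes: Set<String> = []
    func add(_ contact: String) { hashes.insert(digest(contact)) }
    func isInnerCircle(_ contact: String) -> Bool { hashes.contains(digest(contact)) }
    func remove(_ contact: String) { hashes.remove(digest(contact)) }
    func clearAll() { hashes.removeAll() }
    var storedValues: Set<String> { hashes }  // exposed for inspection in tests
}

let store = InnerCircleStore()
store.add("alice@example.com")
```

Lookups work by re-hashing the query and checking set membership, so the plain identifier never needs to be persisted at all.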
The architecture was designed this way from day one. It wasn't bolted on after a privacy scandal. It wasn't a response to GDPR or CCPA (though it exceeds both). It was a founding principle: if the data never leaves, it can never be breached.
There are no "trusted partners" with access to your biometrics. There is no anonymized data sharing. There is no "we may share aggregate statistics." The data lives on your device, is processed on your device, and dies on your device.
To learn more about how body weather works without compromising your data, read our complete guide to body weather or our guide to how LuminaEco protects your health data.
Don't take our word for it. We encourage you to run your own capture and audit us.
You'll find nothing. And if you ever find something, email security@luminaeco.app. We'll fix it publicly within 24 hours and explain exactly what happened.
Most privacy policies are legal documents designed to protect the company. This document is a technical audit designed to protect you.
We didn't just promise privacy. We made it architecturally impossible to break.
No server. No telemetry. No analytics. No biometric fields in the shared data model. Twenty-three automated tests across two platforms that break the build if any of this changes.
Your body weather is yours. The only person who sees your data is you. And if you choose to share it with a partner, they see a weather label — not your heart rate.
That's not a feature. That's the foundation.
Published by the Lumina engineering team. Full source for all privacy tests is available for review during enterprise security assessments. For audit requests, contact security@luminaeco.app.