On-device ML inference monitoring for Swift (iOS, macOS). Tracks latency, confidence, drift, and hardware metrics without ever sending raw inputs.
| Sample | What it shows |
|---|---|
| Sources | SDK source code |
| iOSAppSample | iOS app integration using SwiftUI |
| SPMExamples | Swift Package examples runnable from the terminal or Xcode |
| OnnxExample | Zero-code ONNX Runtime tracking via auto-interceptor |
| MLKitExample | Zero-code ML Kit tracking via auto-interceptor |
| TFLiteExample | TensorFlow Lite manual inference tracking |
| TracingExample.swift | Multi-step tracing with spans |
To run the examples or use the SDK in your application, you need a WildEdge DSN. A DSN is a single configuration value that contains your Project Key and connection details for the WildEdge API. To get your DSN:
- Navigate to https://wildedge.dev/ and sign up or log in.
- Open the dashboard at https://app.wildedge.dev/.
- Create a project (or open an existing project).
- Copy the project DSN for later.
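A DSN is an ordinary URL, so its parts can be inspected with Foundation. An illustrative sketch with placeholder values (not real credentials):

```swift
import Foundation

// Illustrative only: a WildEdge DSN is a URL of the form
// https://<secret>@ingest.wildedge.dev/<key>. The values below are placeholders.
let dsn = "https://abc123@ingest.wildedge.dev/42"
if let url = URL(string: dsn) {
    print(url.user ?? "")        // "abc123" (the secret)
    print(url.host ?? "")        // "ingest.wildedge.dev" (the ingest host)
    print(url.lastPathComponent) // "42" (the project key)
}
```

The SDK's own DsnParserTests cover the real parsing; this is only to show which part goes where.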
Add the WildEdge Swift SDK dependency to your Package.swift file:
```swift
dependencies: [
    .package(url: "https://github.com/wild-edge/wildedge-swift.git", branch: "main")
],
targets: [
    .target(
        name: "MyApp",
        dependencies: [
            .product(name: "WildEdge", package: "wildedge-swift")
        ]
    )
]
```

Alternatively, via Xcode:
- Open your project in Xcode.
- Go to File > Add Package Dependencies....
- Add: https://github.com/wild-edge/wildedge-swift.git
- Select the WildEdge product.
The SDK initializes itself before main() runs. Provide the DSN via either:
Environment variable (Xcode scheme or process environment):

```
WILDEDGE_DSN=https://<secret>@ingest.wildedge.dev/<key>
```

Info.plist (checked as a fallback when the env var is absent):

```xml
<key>WILDEDGE_DSN</key>
<string>https://<secret>@ingest.wildedge.dev/<key></string>
```

WildEdge.shared is then ready for use anywhere in your app. Set WILDEDGE_DEBUG=true (env var) to see verbose auto-init and event logs.
Call WildEdge.initialize explicitly when you need programmatic control (e.g. reading the DSN from a config file):
```swift
import WildEdge

let wildEdge: WildEdgeClient = WildEdge.initialize { builder in
    builder.dsn = "https://<secret>@ingest.wildedge.dev/<key>"
    // builder.debug = true
}
```

For iOS, call this in AppDelegate.application(_:didFinishLaunchingWithOptions:). If no DSN is set, WildEdge.initialize returns a no-op client.
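In a UIKit app, that explicit call lands in the app delegate. A minimal sketch (the placeholder DSN must be replaced with your own):

```swift
import UIKit
import WildEdge

@main
class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(
        _ application: UIApplication,
        didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
    ) -> Bool {
        // Initialize once, before any model code runs. With a missing DSN
        // this returns a no-op client, so the call is safe in all builds.
        _ = WildEdge.initialize { builder in
            builder.dsn = "https://<secret>@ingest.wildedge.dev/<key>"
        }
        return true
    }
}
```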
The SDK automatically intercepts ORTSession creation and run calls at the ObjC runtime level — no import or code changes needed. Just run your existing ONNX code and events appear in the dashboard:
```swift
import OnnxRuntimeBindings

// No WildEdge calls required.
let env = try ORTEnv(loggingLevel: .warning)
let session = try ORTSession(env: env, modelPath: modelPath, sessionOptions: nil)
let outputs = try session.run(withInputs: inputs, outputNames: ["output"], runOptions: nil)
// Load, inference, and unload are tracked automatically.
```

All built-in ML Kit detectors (Face, Object, Image Labeler, Text Recognizer, Barcode Scanner, Pose) are intercepted automatically. Remote model downloads via ModelManager are also tracked:
```swift
import MLKitFaceDetection
import MLKitVision

// No WildEdge calls required.
let detector = FaceDetector.faceDetector(options: options)
// ↑ trackLoad fires automatically

detector.process(VisionImage(image: image)) { faces, error in
    // ↑ trackInference fires automatically
}
```

TensorFlow Lite does not have an ObjC-accessible runtime layer that WildEdge can hook, so use explicit model handles:
import WildEdge
import TensorFlowLite
```swift
import WildEdge
import TensorFlowLite

let handle = wildEdge.registerModel(
    modelId: "mobilenet_v3_int8_cpu",
    info: ModelInfo(
        modelName: "MobileNet V3",
        modelVersion: "3.0",
        modelSource: "local",
        modelFormat: "tflite",
        quantization: "int8"
    )
)
handle.trackLoad(durationMs: 60, accelerator: .cpu)

let start = Date()
try interpreter.invoke()
_ = handle.trackInference(
    durationMs: Int(Date().timeIntervalSince(start) * 1000),
    inputModality: .image,
    outputModality: .classification
)
```

You can group inferences into a named trace using wildEdge.trace:
```swift
wildEdge.trace("user-query") { trace in
    let embedding = trace.span("embed") { _ in
        let start = Date()
        let vector = embedModel.run(input)
        _ = embedHandle.trackInference(durationMs: Int(Date().timeIntervalSince(start) * 1000))
        return vector
    }
    trace.span("classify") { _ in
        let start = Date()
        _ = classifyModel.run(embedding)
        _ = classifyHandle.trackInference(durationMs: Int(Date().timeIntervalSince(start) * 1000))
    }
}
```

- trace {} creates a root span.
- span {} creates child spans with parent linkage.
- trackInference() inside a trace/span inherits correlation context.
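The manual-tracking snippets compute durationMs inline each time. A small Foundation-only helper (not part of the SDK) can factor that out:

```swift
import Foundation

/// Runs `body` and returns its result together with the elapsed wall-clock
/// time in milliseconds. A convenience for the manual-tracking examples in
/// this README; not part of the WildEdge SDK.
func measureMs<T>(_ body: () throws -> T) rethrows -> (result: T, durationMs: Int) {
    let start = Date()
    let result = try body()
    return (result, Int(Date().timeIntervalSince(start) * 1000))
}
```

Usage: `let (vector, ms) = measureMs { embedModel.run(input) }`, then pass `ms` to trackInference(durationMs:).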
Attach structured output metadata to an inference via outputMeta:

```swift
_ = handle.trackInference(
    durationMs: 34,
    outputMeta: DetectionOutputMeta(numPredictions: result.count, avgConfidence: 0.91).toMap()
)
```

Available metadata types: DetectionOutputMeta, GenerationOutputMeta, EmbeddingOutputMeta, TextInputMeta.
| Parameter | Default | Description |
|---|---|---|
| dsn | nil | https://<secret>@ingest.wildedge.dev/<key> (or WILDEDGE_DSN env var / Info.plist key) |
| appVersion | auto-detected | App version attached to every batch |
| batchSize | 10 | Events per request |
| maxQueueSize | 200 | Max in-memory events |
| flushIntervalMs | 60_000 | Background flush interval |
| maxEventAgeMs | 900_000 | Events older than this are dropped |
| lowConfidenceThreshold | 0.5 | Sampling threshold |
| debug | false | Verbose logs (or WILDEDGE_DEBUG=true) |
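Only dsn and debug appear verbatim in the examples in this README; assuming the initialize builder exposes the remaining parameters under the same names as the table, configuration might look like this sketch:

```swift
import WildEdge

// Sketch only: batchSize and flushIntervalMs are assumed setters mirroring
// the parameter table; verify the builder's actual property names.
let client = WildEdge.initialize { builder in
    builder.dsn = "https://<secret>@ingest.wildedge.dev/<key>"
    builder.batchSize = 20           // default 10: events per request
    builder.flushIntervalMs = 30_000 // default 60_000: background flush interval
    builder.debug = true
}
```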
Paste this prompt into your coding agent:
```text
Integrate the WildEdge Swift SDK into this project.

1. Initialize WildEdge once at app startup:
   - Preferred: set WILDEDGE_DSN env var — the SDK auto-inits before main().
   - Alternative: call WildEdge.initialize { $0.dsn = "YOUR_DSN" } at app launch.
2. ONNX Runtime and ML Kit are intercepted automatically — no code changes needed for those.
3. For all other ML inference code (TensorFlow Lite, Core ML, remote LLM APIs):
   - Register a stable model handle with wildEdge.registerModel(modelId:info:)
   - Add trackLoad()/trackUnload() for model lifecycle when applicable
   - Time inference and call trackInference(...)
   - Add success/failure tracking with errorCode on failures
4. Wrap multi-step pipelines in wildEdge.trace("name") { }.
5. Add lifecycle hooks for flush() and close().
6. Send only metadata (WildEdge.analyzeText(...).toMap(), outputMeta maps), never raw inputs.
```
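Step 5's lifecycle hook might look like this in a SwiftUI app. A sketch that assumes the auto-initialized client is reachable as WildEdge.shared and exposes flush() as named in the prompt; ContentView is a placeholder:

```swift
import SwiftUI
import WildEdge

@main
struct MyApp: App {
    @Environment(\.scenePhase) private var scenePhase

    var body: some Scene {
        WindowGroup {
            ContentView() // placeholder root view
                .onChange(of: scenePhase) { phase in
                    // Flush queued events before the app is suspended.
                    if phase == .background {
                        WildEdge.shared.flush()
                    }
                }
        }
    }
}
```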
- Xcode 15+ (recommended)
- Swift 5.9+
- iOS 13+ / macOS 12+
```sh
swift test
swift test --filter DsnParserTests
swift build -c release
cd Examples/SPMExamples
swift run
```

- WildEdge has no required transitive runtime dependencies
- External ML frameworks are integrated by your app (TensorFlow Lite, ONNX Runtime, ML Kit, Core ML)
- Package.swift: Swift SDK package manifest
- Sources: SDK source code
- Tests: SDK test suite
- Examples: iOS app and SwiftPM examples