<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Stories by Oleksii Oliinyk on Medium]]></title>
        <description><![CDATA[Stories by Oleksii Oliinyk on Medium]]></description>
        <link>https://medium.com/@djalex566?source=rss-da87d9f8912c------2</link>
        <image>
            <url>https://cdn-images-1.medium.com/fit/c/150/150/1*RgsFW0PJaH5pp1vXfPjUxQ.png</url>
            <title>Stories by Oleksii Oliinyk on Medium</title>
            <link>https://medium.com/@djalex566?source=rss-da87d9f8912c------2</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Fri, 24 Apr 2026 05:22:50 GMT</lastBuildDate>
        <atom:link href="https://medium.com/@djalex566/feed" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[How SwiftData Represents AttributedString in Core Data Storage]]></title>
            <link>https://medium.com/@djalex566/how-swiftdata-represents-attributedstring-in-core-data-storage-69036a4f166a?source=rss-da87d9f8912c------2</link>
            <guid isPermaLink="false">https://medium.com/p/69036a4f166a</guid>
            <category><![CDATA[swift]]></category>
            <category><![CDATA[swiftdata]]></category>
            <category><![CDATA[swiftui]]></category>
            <category><![CDATA[sqlite]]></category>
            <category><![CDATA[core-data]]></category>
            <dc:creator><![CDATA[Oleksii Oliinyk]]></dc:creator>
            <pubDate>Wed, 19 Nov 2025 15:13:41 GMT</pubDate>
            <atom:updated>2025-11-19T15:13:41.726Z</atom:updated>
            <content:encoded><![CDATA[<p>This year, Apple presented just a few updates for <strong>SwiftData</strong>. One particular change that initially didn’t catch my attention was the ability to store extra data types in a model, such as <strong>AttributedString</strong>. When I saw this update, I thought they had just provided another mechanism to store these types as <strong>Transformable</strong> or serialized <strong>Data</strong> under the hood, but the reality turned out to be different.</p><p>Since I maintain an <a href="https://apps.apple.com/us/app/datascout-for-swiftdata/id6737813684">app for debugging SwiftData</a> storage, this eventually created a problem for me, as I use Core Data to load data dynamically from storage. Users started complaining about crashes when trying to open a database containing these types. Here is the story of what I discovered while investigating it.</p><h3>The issue</h3><p>Let’s look at the error occurring when we attempt to read the value from a property of this data type:</p><pre>object.value(forKey: &quot;icon&quot;)</pre><p>The result is an exception:</p><pre>NSInvalidArgumentException: No NSCoreDataCodableAdapter for name iconJSONAdapter found; (user info absent)</pre><h3><strong>Initial Data Type Check</strong></h3><p>The first obvious place to start this investigation is to check the type of <strong>NSAttributeDescription</strong> in the storage. I expected this part to be straightforward, as Apple didn’t add any new types to Core Data this year, but the outcome was quite unexpected.</p><pre>(lldb) po attribute.attributeType.rawValue<br>2200</pre><p>The type of the attribute is an enum value represented as the UInt value 2200. Let’s see what that means:</p><pre>// All of the primitive types...<br><br>    @available(macOS 10.6, *)<br>    case objectIDAttributeType = 2000<br><br>    @available(macOS 14.0, *)<br>    case compositeAttributeType = 2100<br>}</pre><p>Wait, what? 
That doesn’t seem to correspond to any existing data types in Core Data so far. Does this mean that Apple developers extended Core Data privately for SwiftData purposes?</p><p>Let’s move forward and try to find out what this type is used for.</p><h3>Investigating the Discovered Data Type</h3><p>The first thing I noticed about this new type is that it also has a <strong>valueTransformerName</strong>, just like a transformable attribute. Let’s try to assign a dummy transformer to this type and see what happens.</p><pre>ValueTransformer.setValueTransformer(<br>    DummyTransformer(),<br>    forName: NSValueTransformerName(transformerName)<br>)</pre><p>Aaand… nothing changed. An attempt to read the value from this property still throws the same exception as before.</p><p>Let’s check what other information is available to us in the <strong>NSAttributeDescription</strong>:</p><ol><li>The type of the description seems to be a private class: <strong>NSCodableAttributeDescription</strong>.</li><li>The list of all methods for this type is as follows:</li></ol><ul><li>adapterName</li><li>setAdapterName:</li><li>decode:withRegistry:error:</li><li>encode:withRegistry:error:</li></ul><p>Based on that information and the exception text, we can assume that this new 2200 attribute type is a dedicated “codable” type that stores Codable values through its own mechanism. Previously, Codable properties were either converted automatically into a <strong>composite</strong> type or simply serialized as JSON <strong>binary data</strong>. It seems that with this year’s release, they introduced one more mechanism to store Codable types.</p><h3>Diving into the Details</h3><p>Now, let’s investigate how this new type is decoded. The obvious assumption is that it calls “decode” from the attribute description when we try to access the value. 
Let’s put a symbolic breakpoint on it and see where we get.</p><pre>-[NSCodableAttributeDescription decode:withRegistry:error:]</pre><p>This leads us into the assembly code of the method implementation, which can be interpreted roughly like this:</p><pre>@objc func decode(_ data: Data,<br>                  withRegistry registry: NSCoreDataCodableAdapterRegistry,<br>                  error outError: NSErrorPointer) -&gt; Any? <br>{<br>    // 1. Get adapter (transformer) name<br>    let adapterName = self.adapterName<br><br>    // Prepare temporary error holder<br>    var localError: NSError? = nil<br><br>    // 2. Call into registry to decode using the adapter<br>    //    This selector is: decodeWithData:withAdapterNamed:error:<br>    let decoded = registry.decodeWithData(data,<br>                                          withAdapterNamed: adapterName,<br>                                          error: &amp;localError)<br><br>    // 3. If decoding failed...<br>    if decoded == nil {<br>        if let err = localError {<br>            // 3.1 Check if error is of private type &quot;__NSCoreDataCodableError&quot;<br>            if err.isKind(of: __NSCoreDataCodableError.self) {<br>                // 3.2 Throw Objective-C exception (CoreData uses this internally)<br>                NSException(name: err.exceptionName,<br>                            reason: err.localizedDescription,<br>                            userInfo: nil).raise()<br>            }<br>        }<br>    }<br><br>    // 4. If caller provided an error pointer, fill it<br>    if let outError = outError {<br>        outError.pointee = localError<br>    }<br><br>    // 5. Return the decoded object<br>    return decoded<br>}</pre><p>Now we can see that the new type is using an extra registry to decode the provided codable data based on the adapter name.</p><p>That’s enough information to provide a fix: I can register a dummy decoder to get the original Data as a result. 
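In spirit, the workaround looks something like the sketch below. Keep in mind that everything about the adapter machinery is private API: the class shape and selector here are guesses inferred from the exception message and the decompiled method above, and the actual registration has to be wired up through the Objective-C runtime, which is omitted:</p><pre>// Hypothetical sketch only: the real adapter protocol is private,<br>// and the selector is guessed from the decompiled decode method.<br>final class DummyCodableAdapter: NSObject {<br>    // Pass the raw bytes through unchanged so the app can run<br>    // its own format-detection heuristics on them.<br>    @objc(decodeData:error:)<br>    func decode(_ data: Data, error: NSErrorPointer) -&gt; Any? {<br>        data<br>    }<br>}</pre><p>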
The app is now able to run the same data detection heuristics as it does for encoded binary data and show at least some data if possible.</p><h3>Conclusion</h3><p>It turns out that even “minor” SwiftData updates can trigger foundational changes in Core Data storage. While my workaround restores functionality for now, reverse engineering private classes always involves some guesswork.</p><p>If you have encountered this attribute type in your own debugging or have a better explanation for how the adapter registry functions, let me know in the discussion below.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=69036a4f166a" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Inspecting SwiftData (or just SQLite) right from your app]]></title>
            <link>https://medium.com/@djalex566/inspecting-swiftdata-right-from-your-app-3e79ef5a6bf6?source=rss-da87d9f8912c------2</link>
            <guid isPermaLink="false">https://medium.com/p/3e79ef5a6bf6</guid>
            <category><![CDATA[core-data]]></category>
            <category><![CDATA[ios-app-development]]></category>
            <category><![CDATA[swift]]></category>
            <category><![CDATA[sqlite]]></category>
            <category><![CDATA[swiftdata]]></category>
            <dc:creator><![CDATA[Oleksii Oliinyk]]></dc:creator>
            <pubDate>Sat, 10 May 2025 14:24:59 GMT</pubDate>
            <atom:updated>2025-05-11T16:11:38.711Z</atom:updated>
            <content:encoded><![CDATA[<p>Over the past six months, I’ve been working on my macOS debugging tool, <a href="https://apps.apple.com/us/app/datascout-for-swiftdata/id6737813684">DataScout</a>, which inspects SwiftData stores in real time. While it’s proven useful on the Mac, users often need to peek at their app’s database directly on the device. To solve this, I’ve recompiled DataScout’s core for iOS and wrapped it in a Swift package, <a href="https://github.com/alex566/DataScoutCompanion"><strong>DataScoutCompanion</strong></a>, which lets you present a browser view anywhere in your SwiftUI app. Below, I explain how to use it to solve one of the most common issues: proper data cleanup after deleting the root of an object graph.</p><h3>Example</h3><p>Let’s look at a simple app I started long ago but never finished, and now use as a playground for testing and debugging tools. It’s a Bento-style presentation editor that stores folders with canvases across two databases: local and iCloud.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/798/1*RUh3mLHecn4og21QtYbq2w.png" /></figure><p>To represent each list item and enable cross-device folder sharing while keeping per-device settings (like order and pinned state), I introduced auxiliary “list entry” models in the database.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/434/1*Ec4LR1WjA72N79pLuEwpQw.png" /></figure><p>With the schema in place, I can launch the app and create some test data to verify everything works. I can see when I add or edit items, but it’s often hard to confirm that deleted records are truly removed. For those cases, I need deeper visibility into my data.</p><h3>Debugging the database</h3><p>To make runtime inspection easy, I’ve packaged my app UI as a <a href="https://github.com/alex566/DataScoutCompanion">Swift package</a> you can embed in any app. 
With just a few lines, you can present a live database inspector wherever it makes sense, for example, from a Debug menu:</p><pre>import SwiftUI<br>import DataScoutCompanion<br><br>struct MyView: View {<br>    @State private var isBrowserPresented = false<br><br>    var body: some View {<br>        VStack {<br>            // Your UI implementation goes here<br>            Button(&quot;Inspect Database&quot;) {<br>                isBrowserPresented = true<br>            }<br>        }<br>        .sheet(isPresented: $isBrowserPresented) {<br>            DatabaseBrowser()<br>        }<br>    }<br>}</pre><p>And now, when I present the browser, it immediately displays all the database files my app has created:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/820/1*ciJgs_IuiREkyUCeWYE2iw.png" /></figure><p>Here, I can see the two stores that SwiftData generated from my container setup, and I can drill into either one for more details:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/798/1*d7dpmbAWNEQPSA4HcAfs_A.png" /></figure><p>The first section shows every model in my schema, and even lets me preview each model’s structure as a visual schema:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/666/1*cOkmFrHCBXZWdlbQvIb37A.png" /></figure><p>Now let’s jump to the most interesting part: exploring the actual data.</p><p>I can select “PagesFolderListEntry” to see each entry and which folder it belongs to.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/800/1*9karB-_18PXdjXQq-mZiWQ.png" /></figure><p>From there, I can tap on the folder and navigate directly to it:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/804/1*mrXhDdf166l9GttbKwdu4Q.png" /></figure><p>Drilling down further, I can view all items stored in that folder:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/798/1*5kB-EP4bHdcJoMA_PfuQEQ.png" /></figure><p>This lets me quickly preview each item and verify that my canvas objects were saved with the correct structure:</p><figure><img alt="" 
src="https://cdn-images-1.medium.com/max/774/1*X4f8xtlfQJk9N1UXtNmdQA.png" /></figure><p>Now for the main experiment: I close the browser, delete one of the folders in the app, then reopen the browser to inspect the database again. Here’s what I see:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/770/1*da2iJUDgkJntdOBhdb82vA.png" /></figure><p>It becomes immediately clear what went wrong: the cascade delete removed the folder and its entries, but left the canvas records orphaned. With this insight, I can now fix my delete rules to ensure all related objects are cleaned up properly.</p><h3>Additional: Raw SQLite Support</h3><p>Beyond SwiftData stores, DataScoutCompanion can inspect any raw SQLite file. If your app uses an SQLite wrapper, like GRDB.swift or FMDB, you can still mount and browse the tables directly in the same UI:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/798/1*N2KAMJv8SIvon4SY7u4A8Q.png" /></figure><h3>Conclusion</h3><p>With the release of <a href="https://github.com/alex566/DataScoutCompanion">DataScoutCompanion</a>, I’m offering a simple yet powerful way to bring data transparency directly into your app’s UI during development. I hope this tool makes it easier to inspect, debug, and understand your SwiftData (or raw SQLite) stores, and I look forward to hearing your feedback!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=3e79ef5a6bf6" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Behind the Scenes of DataScout: A SwiftData Debugging Tool]]></title>
            <link>https://medium.com/@djalex566/behind-the-scenes-of-datascout-a-swiftdata-debugging-tool-dcfc880f0733?source=rss-da87d9f8912c------2</link>
            <guid isPermaLink="false">https://medium.com/p/dcfc880f0733</guid>
            <category><![CDATA[product-design]]></category>
            <category><![CDATA[ios-app-development]]></category>
            <category><![CDATA[swiftdata]]></category>
            <category><![CDATA[core-data]]></category>
            <category><![CDATA[swift]]></category>
            <dc:creator><![CDATA[Oleksii Oliinyk]]></dc:creator>
            <pubDate>Wed, 23 Apr 2025 18:27:37 GMT</pubDate>
            <atom:updated>2025-04-24T08:57:01.236Z</atom:updated>
            <content:encoded><![CDATA[<p>There are plenty of apps on the App Store for inspecting Core Data and SQLite databases. Many of them have been around for years, packed with features and matured through long development cycles.</p><p>Not too long ago, Apple introduced a new database framework — SwiftData. It refines the Core Data experience and brings a more modern, Swift-native approach to data modeling and persistence. Since SwiftData is built on top of Core Data and SQLite, most of those older tools technically still work. But in my opinion, what they all lack is the same thing SwiftData brings to the table: <strong>simplicity</strong>.</p><p>That’s why I decided to build <a href="https://data-scout.pages.dev/"><strong>DataScout</strong></a> — a viewer designed specifically for debugging SwiftData apps. My goal was to create an interface that embraces the simplicity of SwiftData while still offering deep insights into your stored data.</p><p>In this article, you’ll get more than a feature tour — you’ll see the “why” and the “how” behind each decision. Every section is split into two parts: the design reasoning and the implementation details.</p><h3>Database Discovery: Simulators, Apps, and More</h3><h4>💡 Design</h4><p>Finding the right database should be as simple and frictionless as possible. 
To make that experience smooth, the app opens with a <strong>browser window</strong> front and center, helping users quickly locate the database file from all possible sources.</p><p>Here’s what’s available so far:</p><ul><li>A <strong>list of iOS simulators</strong>, with active ones and those containing data prioritized at the top</li><li>A <strong>list of apps and app groups</strong> for the currently open simulator</li><li>A <strong>folder list</strong> for any custom locations the user has added manually</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*lrR9cfIvvhm2Oxl79RDd3Q.png" /></figure><p>This structure gets you to the data in as few clicks as possible. That said, I realize the interface might look a bit busy at first glance. If you have ideas on how to simplify it further without sacrificing usability, feel free to leave a comment — I’m always open to improving the experience in future updates.</p><h4>🔧 How it works</h4><p>At first, I considered using simctl to list simulators and apps, but quickly ran into limitations in sandboxed environments. So instead, I went with a straightforward folder scan approach. Users are prompted to grant access to their Library folder (I know, it’s a big ask, but the idea was to eventually support scanning installed databases on macOS too).</p><p>Once permission is granted, the app can detect all available simulators with data, making it easier to get a full overview and maybe even realize it’s time to clean things up.</p><p>Support for custom folders is handled similarly, and again, the sandbox plays a big role here: you can’t just open a .sqlite file directly when it’s in WAL mode; SwiftData needs access to the accompanying -wal and -shm files. 
So, instead of letting users pick a file, it requires full access to the containing folder to ensure everything loads properly.</p><h3>Your first look at an opened database</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*0tx6Rr9Lo4GUglqVz6_BmQ.png" /></figure><h4>💡 Design</h4><p>The window is structured to give both a high-level view of your models and access to more detailed information about their SQLite representation.</p><h4>🔧 How it works</h4><p>Most of the UI is built using SwiftUI, which makes it easy to iterate quickly and keep the layout declarative and predictable.</p><p>But the real workhorse here is the table view: it’s a fully custom, Metal-based renderer. That might sound like overkill for a database viewer, but it’s what allows the app to scroll through thousands of rows at a smooth 120 FPS, even on older Macs. I’ve covered more about this rendering approach in <a href="https://medium.com/@djalex566/fast-fluid-integrating-rust-egui-into-swiftui-30a218c502c1">this article</a>, so feel free to check that out if you’re curious about the performance side.</p><h3>Data Models Overview</h3><h4>💡 <strong>Design</strong></h4><p>The left sidebar serves as the primary way to explore the structure of your database. At the top, you’ll see a list of SwiftData models, laid out in a way that mirrors how you’d recognize them in your project, typically named after your Swift files.</p><p>Each model entry includes:</p><ul><li>The number of stored entities</li><li>Any inherited model types</li></ul><p>This section aims to strike a balance between familiarity and depth: giving you just enough context without overwhelming detail.</p><p>Below the model list is a more technical view showing the raw SQLite tables. 
While most users will spend their time with the Swift-style models, this lower-level section is available for those moments when you need to debug something a bit closer to the details.</p><h4>🔧 <strong>How it works</strong></h4><p>The model list is populated directly from the NSManagedObjectModel, making it a straightforward reflection of what SwiftData sees internally.<br> The table list is implemented by using SQLite directly to query all available tables in the database. It’s a fallback for users who want to inspect the raw schema or troubleshoot at the database level.</p><h3>Inspecting the Content</h3><p>Next, let’s take a look at the table view itself.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*pXnwWu_mFcIE_Dtc0fORUA.png" /></figure><h3>Relationships Navigation</h3><h4>💡 <strong>Design</strong></h4><p>At the top left, you’ll find the next layer of navigation: <strong>relationship breadcrumbs</strong>. On the root level, this simply shows that you’re viewing all instances of the selected entity. But this part becomes especially useful when exploring relationships between models — it acts as a path indicator as you drill down into related data.</p><p>Most other tools tend to show relationships in a separate split view, which can be informative, but (at least in my experience) makes it harder to understand the actual path you’re following through the data. I’ve heard positive feedback specifically about this breadcrumb approach, so I’m pretty happy with how it turned out.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/650/1*DegheqwC6ZMzarWmvdw31Q.png" /></figure><h4>🔧 <strong>How it works</strong></h4><p>The UI for this part is fairly straightforward. The real challenge lies in keeping everything in sync. Since all related datasets are held in memory, any change in the data (like updates or deletions) has to be propagated through all open views and relationships. 
That adds some behind-the-scenes complexity to what appears to be a simple, fluid navigation experience.</p><h3>Advanced Filtering with Predicate</h3><h4>💡 Design</h4><p>Just below the relationship breadcrumbs is the <strong>predicate filter</strong>, which lets you refine the dataset based on specific conditions.</p><p>The predicate system is designed to behave as closely as possible to what you’d expect from real SwiftData or Core Data filtering. That turned out to be quite a challenge, especially when handling complex expressions and supporting different data types.</p><p>Some data types aren’t supported yet, but I’m actively working on expanding coverage as I go.</p><p>The <strong>predicate editor</strong> itself also tries to recreate the feel of Xcode. It highlights known symbols and offers autocompletions to help you write valid predicates more easily. While it may seem like a small detail, it took a lot of behind-the-scenes work to make it feel smooth and natural.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*cuhf_iichyoIcde2ONsCXg.png" /></figure><h4>🔧 How it works</h4><p>This part is arguably the most complex implementation in the entire app, maybe even the biggest single chunk of code. I explored at least five different approaches to achieve the most accurate and reliable behavior:</p><ul><li>Running the code directly on the Swift toolchain</li><li>Compiling it into a binary and linking it to the app for execution</li><li>Mapping the string into NSPredicate directly</li><li>Compiling it as WebAssembly (WASM)</li><li>Running it inside a small custom virtual machine</li></ul><p>Each of these had serious drawbacks that ultimately made them infeasible.</p><p>So I settled on a simpler (in theory) but still quite ambitious solution: building a function that takes a string as input and outputs a Swift Predicate, which could then be used to initialize an NSPredicate.</p><p>This turned out to be a massive effort. 
It required a lot of utility classes and internal logic to make the parsing and construction reliable. The implementation also takes advantage of a new Swift 6 feature: <strong>Implicitly Opened Existentials</strong>, which made some of the generic work easier.</p><p>One of the trickier blockers was creating a KeyPath from a string input. Thankfully, since our predicates are constrained to NSObject, we can use selector-backed KeyPaths, which can be constructed from plain strings. That simplification was key to making the system work.</p><p>Let’s break down all the steps a string goes through to finally become an NSPredicate:</p><ol><li>First, we validate that the code is actually correct. For the best accuracy, this step uses SourceKit diagnostics.</li><li>Then we parse the code into a syntax tree using SwiftSyntax.</li><li>Next, we expand the code using the Predicate macro implementation from Swift’s Foundation framework. This gives us the syntax tree of the predicate builder.</li><li>After that, we need to match all the build functions to their correct data types. Since Swift’s type inference is pretty advanced and not something you can easily replicate, we call SourceKit again to help out.</li><li>Now that we have both the tree and all the resolved types, we can rebuild those structures as a real Predicate instance.</li><li>And finally, we initialize the NSPredicate and use it to fetch the data.</li></ol><p>It’s a long process, and one that sounds overkill until you try to actually support the full range of what NSPredicate can do. Some people might suggest it’d be easier to just map the string directly into an NSPredicate. I’ve tried that too, and honestly, I can’t recommend it. It breaks down quickly as soon as you need anything more than basic comparisons.</p><p>But just processing the string correctly is only half the problem. 
The other half is building a code editor that highlights the syntax and handles autocomplete in a way that feels familiar, ideally close to Xcode.</p><p>The simplest part is reconstructing the string with syntax highlights from the abstract syntax tree. But it quickly became clear that without showing known symbols, the experience felt incomplete and unpleasant.</p><p>That’s when I had to go back to digging through my research. The most straightforward idea was to enable SourceKit’s debug mode in Xcode and track all the logs while editing. It helps, but it also brings a lot of confusion — sometimes the info isn’t enough, and to really track known symbols, you need to gather data from several kinds of information. The most obvious one is autocomplete.</p><p>So the next challenge is: once you start building up a list of known symbols, how do you keep them up to date while editing the code?</p><p>A common approach is to store symbol offsets along with extra metadata. But what happens when the user edits something earlier in the line and the offsets shift? In that case, you need to reposition everything correctly, and even then, you run into edge cases. For example, if the start of the expression changes and a symbol should become invalid, sometimes the editor still highlights it anyway.</p><p>But after spending more time testing Xcode, I realized it does the same thing. It doesn’t always immediately remove highlights either. So I took that as a green light.</p><p>With all of that in mind, I built the editor to combine information from multiple sources and update it live as you type. It took some effort to get right, but now it feels much smoother and much closer to the kind of experience you’d expect.</p><p>One missing piece was matching Xcode’s autocomplete behavior. I ended up simplifying that down to a single rule: <em>if the cursor is inside a method call declaration, show autocomplete suggestions filtered by the currently typed part of the name</em>. 
It’s not perfect, but in most cases, it feels right, and from the user’s perspective, that’s good enough.</p><p>It might seem like overkill to put this much work into just the filtering feature. But I really wanted it to feel natural and intuitive. I didn’t want to throw an unfamiliar syntax at the user or expect them to remember obscure NSPredicate formats. And I definitely didn’t want to build one of those filters made up of dropdowns and toggles — it always ends up being clunky. For me, this felt like the only solution that made sense.</p><h3>Smart Column Headers</h3><h4>💡 Design</h4><p>Now we can see the full list of model properties displayed as table column headers.</p><p>At first glance, this part might seem pretty straightforward, but there’s actually quite a bit happening under the hood as well.</p><p>Visually, the headers are designed to mirror how SwiftData model properties appear in your code. But they also offer <strong>additional details on hover</strong>, offering a deeper layer of insight without cluttering the main view.</p><p>One of the more subtle but useful features here is <strong>property ordering</strong>. The table automatically prioritizes key information: identification fields come first (like IDs), followed by properties that are typically more informative, such as name or title. This makes it easier to quickly spot the most important fields in your data.</p><p>Another hidden detail is the <strong>accuracy of the displayed type names</strong>. Instead of showing low-level Core Data types, the table presents the actual types used in your Swift code. This helps keep the experience aligned with how you think about your models while working in Swift.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/708/1*Lb7o4dF5SkLSDf997elcWg.png" /></figure><h4>🔧 How it works</h4><p>To determine the property order, it uses a small CoreML linear regression model that scores the relative importance of each property for the user. 
It’s a minor detail, but it really helps make large datasets easier to scan.</p><p>For the property types, the tool takes things a step further. Instead of relying on Core Data’s internal metadata, it analyzes the app’s compiled binary. When a database is opened from a specific simulator app and the tool can locate the corresponding binary, it can extract model metadata directly from it. That’s what allows it to show precise type names and match the visual structure of the model to how it’s written in your code.</p><h3>The Data Previews</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*LmGmBvHZvQkLMzujcPD7dw.png" /></figure><h4>💡 Design</h4><p>Now let’s finally look at the actual data. The values are pre-formatted for maximum readability, even if they’re stored as raw String or Data.</p><p>In the screenshot above, you can see a few examples:</p><ul><li>The UUID is highlighted with an emphasis on its numeric components.</li><li>The name appears like a string literal, making it immediately obvious that it’s a plain string.</li><li>The cells property is stored as raw NSData, but the format was recognized as JSON, so it’s highlighted and formatted accordingly.</li><li>The background property is a composite type. In Core Data, that’s stored as an NSDictionary; in SQLite, it’s split into separate columns. But here, it’s shown as a clean type exactly like in the code.</li></ul><p>This formatting logic is recursive, so even if a media type or composite value is deeply nested, the tool will still recognize it and present it in a human-friendly way.</p><p>If the data is too large to fit into the table, you can click the cell to open an expanded, formatted preview right in place.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/398/1*gkHEOykQ2dA6gMJu3m0QrQ.png" /></figure><p>In some cases, even if the preview isn’t fully implemented yet, the app still recognizes the format. 
That means the underlying detection logic is already in place, and full preview support will be added in future updates.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/442/1*imfdcQvvTkFe5la4NRgqcg.png" /></figure><p>If a property is a relationship, it’s styled with a distinct background to highlight that it’s clickable. It also shows a brief summary of the referenced entity to give you quick context.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/854/1*_KrPY9LgpKpQOXRjvNHHww.png" /></figure><p>Clicking the reference lets you drill down into that related data, and the breadcrumb at the top updates to reflect the navigation path. You can also jump back to any previous point in the path by clicking its segment.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*EC2f8DYuJz4qYPEaiex5CA.png" /></figure><h4>🔧 How it works</h4><p>After the data is fetched from the database, it’s passed to the Rust-based core of the app, which is optimized to handle the extra processing needed for a smooth and readable preview.</p><p>Most types have a pre-defined appearance and can be directly transformed into attributed strings. But for more complex cases, where the raw value isn’t very informative, the pre-processor runs additional checks to detect known formats. For example, it can tokenize JSON for syntax highlighting.</p><p>It’s also responsible for merging metadata from multiple sources, like type information extracted from the compiled binary or scoring hints from the linear regression model, to build previews in a meaningful way.</p><h3>Event Console</h3><h4>💡 Design</h4><p>Finally, at the bottom of the screen, we have the console, where you can see a timeline of events related to the table, along with how long each step took.</p><p>The first event you’ll usually see is the initial analysis phase I mentioned earlier, where extra data type information is collected. 
Then comes the event showing how long it took to load the data, followed by the time needed to recognize and format the data entries.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*pmRxt8OD99U5cLWcd-oMTw.png" /></figure><p>There’s also a dedicated tab showing the full process of building the predicate, broken down step by step.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/842/1*2lzCoWA_Elw5SOW3hDqZ9w.png" /></figure><p>All the steps from the Predicate section described earlier are logged here, and each one can be expanded to see the exact result of that stage.</p><h4>🔧 How it works</h4><p>Technically, there’s nothing too complex about this part: it’s just a simple SwiftUI List. However, despite its simplicity, it doesn&#39;t perform as well as expected. Sometimes, scrolling lags, and the expanding functionality breaks intermittently. At this point, there are only two viable solutions: either move this component to the Rust-based part of the UI or wait for Apple to improve the List. In macOS 15.4.1, there has been noticeable progress; scrolling lag has almost disappeared, but I still hope they’ll address the expansion issue as well.</p><h3>Live Updates</h3><h4>💡 Design</h4><p>When the user interacts with the app in the simulator and the data changes, the updates are reflected in real time within DataScout. Changes to Core Data are highlighted, making it easy to pinpoint where the change occurred while continuously displaying relevant information. This simple yet effective approach is what I’ve opted for so far.</p><h4>🔧 How it works</h4><p>This feature is fairly straightforward. It processes the most recent transactions, walks through the displayed data, and updates it accordingly. By leveraging additional indexing of IDs, it quickly identifies modified objects and refreshes their values along with the timestamp of the last modification. 
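</p>

<p>The core of that update step can be sketched in a few lines of Rust (a simplified model for illustration, not the app’s actual code):</p>

```rust
use std::collections::HashMap;

// A displayed row: the rendered value plus the time it last changed.
struct Row {
    value: String,
    last_modified: u64, // transaction counter or timestamp
}

// Apply one transaction's changes using an (object ID -> row position) index,
// so modified objects are found without rescanning the whole table.
fn apply_changes(
    rows: &mut [Row],
    index: &HashMap<u64, usize>,
    changes: &[(u64, String)], // (object ID, new value)
    now: u64,
) {
    for (id, new_value) in changes {
        if let Some(&i) = index.get(id) {
            rows[i].value = new_value.clone();
            rows[i].last_modified = now;
        }
    }
}
```

<p>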
The table then triggers a blink animation based on the updated timestamp, visually highlighting the changes.</p><h3>Bonus: Extra Databases</h3><p>Since the table view and data recognition are implemented as independent components, I can also connect it directly to SQLite for scenarios where Core Data isn’t used. This retains the data format recognition and relationship navigation features but loses the ability to track recent transactions, as the storage configuration can vary. As a result, the app won’t highlight changes in the UI in this case. While there are opportunities to build more advanced solutions for this later, my backlog is already quite full. For this integration, the app uses Turso’s <a href="https://github.com/tursodatabase/libsql">libsql</a> instead of standard SQLite, offering additional features while maintaining compatibility with SQLite.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*FWiW385HVQXir0Xd0MFAww.png" /></figure><p>I’ve also implemented an alternative presentation mode for NoSQL data structures, though currently, this UI is only used for one additional database format, which was popular in Flutter — Hive. I added it in an effort to broaden the app’s audience, particularly for Flutter developers, but it hasn’t garnered much attention. In the future, I plan to repurpose this UI for other NoSQL databases commonly used in mobile development.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*og_siBUb-9x0aePS7Z7srA.png" /></figure><h4>🔧 How it works</h4><p>In the code, I had to provide an alternative tree view for the content, in addition to the table format, and I think this could be useful for other scenarios as well, such as reusing it for the console.</p><p>Regarding the database itself, since the original implementation is based on Dart, I re-implemented a specific part for reading the data in Rust and adjusted the logic to improve the accuracy of recognizing unknown types. 
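</p>

<p>As a rough illustration of the cheapest layer of that recognition, a first pass can classify a raw blob just from its leading bytes (illustrative only, not the real detection logic):</p>

```rust
// Guess the format of a raw stored blob from its first bytes.
// Real detection involves scoring and deeper parsing; this is the cheap gate.
fn guess_format(bytes: &[u8]) -> &'static str {
    if bytes.starts_with(b"{") || bytes.starts_with(b"[") {
        "json" // likely serialized JSON
    } else if bytes.starts_with(b"bplist00") {
        "binary plist" // Apple binary property list magic number
    } else if std::str::from_utf8(bytes).is_ok() {
        "text" // valid UTF-8, show as a string
    } else {
        "binary" // fall back to a raw/hex preview
    }
}
```

<p>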
However, I’m still struggling to gather any feedback from Flutter developers. In hindsight, it might not have been worth the effort.</p><h3>Bonus: Work in Progress and Future Ideas</h3><p>The last thing I want to mention is the future direction of the app.</p><p>While it’s already packed with useful functionality, there are still some basic features missing that could be critical for certain users. I’m currently evaluating the best ways to integrate these into the UI. Some of these features include:</p><ul><li>Setting the sorting order</li><li>Adding missing data types for Predicate</li><li>Displaying the schema visually</li><li>Showing images within the table UI</li><li>Handling encrypted databases</li><li>Implementing a simple search for text entries</li></ul><p>These features will be added as soon as I find a way to seamlessly integrate them into the app’s existing style, keeping the approach simple and close to the code representation.</p><p>Besides that, there are also some larger plans I’m still researching, such as:</p><ul><li><strong>Linking the database from a physical iPhone (or even Android) using a local network and an extra SDK for debugging.</strong> This feature is partially <a href="https://youtu.be/JLxkUCnV2Sg">functional</a> in my test builds but is currently blocked by the specifics of SQLite storage (due to shared memory corruption during transfer when the database is in WAL mode). I’ve also vibe coded an <a href="https://github.com/alex566/file-mirror">SDK</a> for initial testing (Disclaimer: the vibe of this code is still rough, mostly suitable for prototyping. Only a small part has been polished so far, but I plan to refine it soon). 
The initial approach doesn’t seem viable for SQLite, so I’m planning to implement another SDK using the libSQL replication feature instead.</li><li><strong>Editing data</strong> is a major topic, but could be tricky, especially when data is heavily formatted for better representation or when it’s linked via a network.</li><li><strong>Auto-detecting Queries</strong> used in the app views and showing them as a list, following the list of models. This one is complex and will require adding some missing features to the Predicate (like variables), but it should be achievable.</li></ul><h3>Conclusion</h3><p>DataScout started out as my personal playground for experimenting with SwiftSyntax, SourceKit, CoreML, Rust, and more. Along the way, I built features like relationship breadcrumbs, live updates, Metal-powered tables, and smart predicate parsing — each one driven by the goal of making database inspection feel as natural and code-like as possible.</p><p>But beyond my own sandbox, I’ve seen real value in the app. Features like in-place data previews, custom type highlighting, and breadcrumb navigation have already helped me and a few users.</p><p>Looking ahead, I’ll keep adding missing pieces. My guiding principle remains the same: keep the UI simple, stay close to how you write code, and never hide useful information behind too many menus.</p><p>If you’ve enjoyed this walkthrough or have ideas for DataScout, drop a comment or reach out. Your feedback will help shape the next updates and keep this tool both useful and easy to use.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=dcfc880f0733" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Fast & Fluid: Integrating Rust egui into SwiftUI]]></title>
            <link>https://medium.com/@djalex566/fast-fluid-integrating-rust-egui-into-swiftui-30a218c502c1?source=rss-da87d9f8912c------2</link>
            <guid isPermaLink="false">https://medium.com/p/30a218c502c1</guid>
            <category><![CDATA[macos-development]]></category>
            <category><![CDATA[egui]]></category>
            <category><![CDATA[swiftui]]></category>
            <category><![CDATA[appkit]]></category>
            <category><![CDATA[rust]]></category>
            <dc:creator><![CDATA[Oleksii Oliinyk]]></dc:creator>
            <pubDate>Wed, 26 Mar 2025 10:48:40 GMT</pubDate>
            <atom:updated>2025-03-26T10:48:40.252Z</atom:updated>
            <content:encoded><![CDATA[<p>Let’s be honest: the path we’re about to take in this article isn’t a simple one. If you’re considering this approach for your app, you should be fully aware of the risks involved. That said, I took this risk in my <a href="https://apps.apple.com/us/app/datascout-for-sqlite-swiftdata/id6737813684">hobby app</a>, and so far, I’m really happy with the <a href="https://youtu.be/Udn3yr2-D20">results</a>. So, if you’re up for an experiment, let’s dive in!</p><p>A little backstory on how I ended up deciding to spice up my app with Rust: A few months ago, I was working on a simple app to display the contents of a database while debugging. The goal was to quickly spot errors while implementing cloud sync. I planned to spend just a couple of evenings on it, but things took an unexpected turn.</p><p>SwiftUI’s Table had pretty poor performance. My first idea was to replace it with NSTableView, but to my surprise, it turned out to be just as limited and unstable. At that point, this issue completely captured my attention—I had already decided to turn this into a proper, polished app that I could submit to the App Store. But optimizing old and buggy table views didn’t seem like the most exciting challenge.</p><p>Then it hit me: with no looming deadlines, I could explore new technical horizons. A challenging idea emerged — why not seek out the fastest UI library from an unexpected language and find a way to integrate it into my app?</p><p>After a bit of searching, egui caught my attention. Unlike traditional UI frameworks, egui uses an immediate mode approach, meaning the UI is redrawn every frame instead of maintaining persistent state objects. This meant I didn’t have to worry about managing state updates, and the UI rebuilding process seemed incredibly fast. 
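</p>

<p>To make the contrast concrete: in immediate mode there are no widget objects to keep in sync, because the UI description is just a function of the current state, rebuilt every frame. Here is a deliberately tiny model of that idea in plain Rust (not actual egui API):</p>

```rust
// Immediate mode in miniature: "the UI" is the output of a function over
// app state. Change the state, and the next frame's description changes too;
// there is nothing to invalidate or keep in sync.
struct AppState {
    messages: Vec<String>,
    input: String,
}

fn build_frame(state: &AppState) -> Vec<String> {
    let mut widgets = Vec::new();
    for m in &state.messages {
        widgets.push(format!("label: {m}"));
    }
    widgets.push(format!("text_edit: {}", state.input));
    widgets
}
```

<p>egui works the same way conceptually, just with real widgets and layout: the closure you hand to the context describes the entire UI for that one frame.</p>

<p>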
So, I decided to give egui a shot.</p><p>Now, let’s move on to the core idea I want to explore: keeping the high-level structure of the app in SwiftUI for ease of handling while isolating egui in a specific, performance-sensitive part. The goal is to establish a smooth and convenient way to communicate between Swift and Rust.</p><p>As an example, we’ll build a simple chat window where the active conversation can be switched from the Swift side. The full source code will be linked at the end of the article.</p><p>I’ll be following the same steps in this article as I do when implementing the sample app.</p><blockquote><strong>Disclaimer:</strong> This article does not cover all related topics in detail, but it provides links to documentation where you can find more information. It assumes you have a basic understanding of macOS development with SwiftUI, some experience with Rust, and a fundamental knowledge of GPU programming. The focus will be on guiding you through the high-level steps rather than explaining every concept from scratch.</blockquote><h3>Building the bridge</h3><p>At first, I attempted to establish communication using a simple FFI, but the data transfer implementation proved time-consuming and fraught with memory management challenges. Ultimately, I turned to the <a href="https://github.com/chinedufn/swift-bridge">swift-bridge</a> library to simplify the process. Let’s follow the same approach here.</p><p>Let’s begin by creating a library package for our Rust UI:</p><pre>cargo init --lib --name smooth-ui</pre><p>Next, modify the Cargo.toml to configure it as a static library:</p><pre>[lib]<br>name = &quot;embed&quot;<br>crate-type = [&quot;staticlib&quot;]</pre><p>I recommend setting up the swift-bridge dependency in a separate Swift package from <a href="https://chinedufn.github.io/swift-bridge/building/swift-packages/index.html">this</a> article, though you can choose an alternative method that suits your workflow. 
After setting up the dependency, create a file to define the interface visible to Swift:</p><pre>#[swift_bridge::bridge]<br>mod ffi {<br>    extern &quot;Rust&quot; {<br>    }<br>}</pre><p>I’ve also included a build script in the <a href="https://github.com/alex566/swiftui-egui-demo/blob/main/smooth-ui/scripts/build_debug.sh">project</a> that you can reference and copy into your own work.</p><p><strong>Tip:</strong> If you’re new to this setup, I strongly recommend implementing a simple “Hello World” function and calling it from Swift first. This approach will help you gain confidence and clarity before diving into more complex implementations.</p><h3>Creating the Renderer</h3><p>With our project configured, we can now create our Renderer responsible for handling graphics rendering for the UI:</p><pre>pub struct Renderer {<br>    device: wgpu::Device,<br>    queue: wgpu::Queue,<br>    surface: wgpu::Surface&lt;&#39;static&gt;,<br>    config: wgpu::SurfaceConfiguration,<br><br>    // The rest will come on egui part<br>}</pre><p>Let’s define the renderer’s interface with three key functions essential for this tutorial:</p><pre>impl Renderer {<br><br>    // layer_ptr is the pointer to our CAMetalLayer from AppKit/UIKit side<br>    pub fn new(layer_ptr: *mut std::ffi::c_void, width: u32, height: u32) -&gt; Self {<br>        let descriptor = wgpu::InstanceDescriptor {<br>            backends: wgpu::Backends::METAL,<br>            ..Default::default()<br>        };<br>        let instance = wgpu::Instance::new(&amp;descriptor);<br>        // Create a surface with our metal layer, passed from the swift side<br>        let surface = unsafe {<br>            instance.create_surface_unsafe(wgpu::SurfaceTargetUnsafe::CoreAnimationLayer(layer_ptr)).unwrap()<br>        };<br><br>        // The rest of the setup from https://sotrh.github.io/learn-wgpu/beginner/tutorial2-surface/#state-new<br>        // Or check the final source code for the complete picture<br>    } <br><br>    
pub fn resize(&amp;mut self, width: u32, height: u32) {<br>        // The implementation from https://sotrh.github.io/learn-wgpu/beginner/tutorial2-surface/#resize<br>    }<br><br>    pub fn render(&amp;mut self, time: f64) {<br>        // Check https://sotrh.github.io/learn-wgpu/beginner/tutorial2-surface/#render <br>        // Or final source code for the complete picture with egui<br>    }<br>}</pre><p>For those unfamiliar with GPU programming, I recommend following the wgpu <a href="https://sotrh.github.io/learn-wgpu/beginner/tutorial2-surface/">tutorial</a> to implement the remaining setup process based on your specific requirements. The critical aspect here is configuring the surface for our Metal layer.</p><p>Now, we’ll add this interface to the FFI module to enable access from Swift code:</p><pre>use super::renderer::Renderer;<br><br>use std::ffi::c_void;<br><br>#[swift_bridge::bridge]<br>mod ffi {<br>    extern &quot;Rust&quot; {<br>        type Renderer;<br><br>        #[swift_bridge(init)]<br>        fn new(layer_ptr: *mut c_void, width: u32, height: u32) -&gt; Renderer;<br><br>        fn resize(&amp;mut self, width: u32, height: u32);<br><br>        fn render(&amp;mut self, time: f64);<br>    }<br>}</pre><p>Key points to note:</p><ul><li>The implementation focuses on setting up a Metal-based rendering surface</li><li>Detailed setup can be referenced in the suggested tutorials</li><li>The FFI module provides a clean interface for Swift interaction</li></ul><h3>Implementing the Swift Renderer</h3><p>Let’s start by creating an NSView backed by CAMetalLayer to display our Rust-generated UI:</p><pre>import Metal<br>import AppKit<br>import SmoothUI // The package generated with swift-bridge<br><br>@MainActor<br>final class SmoothNSView: NSView {<br>    <br>    var metalLayer: CAMetalLayer {<br>        layer as! 
CAMetalLayer<br>    }<br>    <br>    override init(frame frameRect: NSRect) {<br>        super.init(frame: frameRect)<br>        wantsLayer = true<br>    }<br>    <br>    override func makeBackingLayer() -&gt; CALayer {<br>        CAMetalLayer()<br>    }<br>}</pre><p>Next, we’ll create a renderer controller to manage view callbacks:</p><pre>import Metal<br>import AppKit<br>import SmoothUI<br><br>@MainActor<br>final class SmoothRendererController: NSObject {<br>    private weak var view: SmoothNSView!<br>    private var displayLink: CADisplayLink!<br>    <br>    private var renderer: Renderer!<br>    <br>    deinit {<br>        MainActor.assumeIsolated {<br>            displayLink.invalidate()<br>        }<br>    }<br>    <br>    func initialize(view: SmoothNSView, size: CGSize, scale: CGFloat) {<br>        self.view = view<br>        <br>        let layer = view.metalLayer<br>        layer.framebufferOnly = true<br>        layer.pixelFormat = .bgra8Unorm_srgb<br>        layer.drawableSize = size<br>        layer.contentsScale = scale<br>        <br>        let rawPointer = Unmanaged.passUnretained(layer).toOpaque()<br>        renderer = Renderer(<br>            rawPointer,<br>            UInt32(max(size.width, 100.0) * scale),<br>            UInt32(max(size.height, 100.0) * scale)<br>        )<br>    }<br>    <br>    func start(view: SmoothNSView) {<br>        let link = view.displayLink(target: self, selector: #selector(render))<br>        link.add(to: .main, forMode: .common)<br>        self.displayLink = link<br>    }<br>    <br>    @objc<br>    func render(displayLink: CADisplayLink) {<br>        renderer.render(<br>            Date.now.timeIntervalSince1970<br>        )<br>    }<br>    <br>    func resize(to size: CGSize, scale: CGFloat) {<br>        guard size.width &gt; 0 &amp;&amp; size.height &gt; 0 else {<br>            return<br>        }<br>        renderer.resize(<br>            UInt32(size.width * scale),<br>            UInt32(size.height * scale)<br> 
       )<br>    }<br>}</pre><p>To integrate with SwiftUI, we’ll create an NSViewRepresentable wrapper:</p><pre>import SwiftUI<br>import AppKit<br><br>struct SmoothView: View {<br>    <br>    @Environment(\.displayScale)<br>    var displayScale<br>    <br>    var body: some View {<br>        GeometryReader { proxy in<br>            SmoothNSViewRepresentable(<br>                size: proxy.size,<br>                scale: displayScale<br>            )<br>        }<br>    }<br>}<br><br>private struct SmoothNSViewRepresentable: NSViewRepresentable {<br>    let size: CGSize<br>    let scale: CGFloat<br>    <br>    func makeNSView(context: Context) -&gt; SmoothNSView {<br>        let view = SmoothNSView(frame: .zero)<br>        context.coordinator.initialize(view: view, size: size, scale: scale)<br>        context.coordinator.start(view: view)<br>        return view<br>    }<br>    <br>    func updateNSView(_ nsView: SmoothNSView, context: Context) {<br>        context.coordinator.resize(to: size, scale: scale)<br>    }<br>    <br>    func makeCoordinator() -&gt; SmoothRendererController {<br>        SmoothRendererController()<br>    }<br>}</pre><p>Finally, we can embed the view in our app’s UI:</p><pre>import SwiftUI<br><br>struct ChatListView: View {<br>    <br>    var body: some View {<br>        NavigationSplitView {<br>            List {<br>                Text(&quot;Chat 1&quot;)<br>                Text(&quot;Chat 2&quot;)<br>            }<br>        } detail: {<br>            SmoothView()<br>        }<br>    }<br>}</pre><p>At this point, running the project should set up rendering for SmoothView on the Rust side and display it within the SwiftUI view hierarchy.</p><p><strong>Tip:</strong> If you’re new to GPU programming, I recommend implementing a simple triangle rendering from the wgpu <a href="https://sotrh.github.io/learn-wgpu/beginner/tutorial2-surface/">documentation</a> before proceeding further. 
This will help you understand the basic rendering concepts and setup.</p><h3>Rendering our first egui label</h3><p>Let’s add the necessary dependencies to your Cargo.toml:</p><pre>wgpu = &quot;23.0&quot;<br>egui = &quot;0.30.0&quot;<br>egui_wgpu_backend = &quot;0.33.0&quot;</pre><p>A quick note on the backend: while I’m using egui_wgpu_backend for this tutorial, my experience with it has been nuanced. In my database project, I ultimately developed a custom rendering backend because the existing implementation didn&#39;t meet my performance requirements.</p><p>The standard egui_wgpu_backend seemed primarily optimized for WebGL, and I found its Metal rendering performance less than ideal. On Metal, a single frame was taking around 10 milliseconds, which wasn&#39;t acceptable for my use case. Through a custom implementation, I managed to reduce frame rendering time to just 1-2 milliseconds.</p><p>This performance optimization journey is a fascinating topic in itself. I’m considering open-sourcing my custom implementation in the future, which could be the subject of a separate, in-depth article. 
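</p>

<p>If you want to verify numbers like these in your own setup, wrapping the per-frame work in a std::time::Instant timer is enough for a first approximation (a generic sketch, not tied to any particular backend):</p>

```rust
use std::time::{Duration, Instant};

// Run one frame's worth of work and report how long it took on the CPU side.
fn timed_frame<F: FnMut()>(mut render: F) -> Duration {
    let start = Instant::now();
    render();
    start.elapsed()
}
```

<p>Note that this only captures CPU-side encoding time; for the full picture you’d also need GPU timestamps, but it’s a quick way to compare backends.</p>

<p>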
For now, we’ll proceed with the standard egui_wgpu_backend to demonstrate the core integration.</p><p>Let’s update our Renderer struct with additional properties for egui:</p><pre>pub struct Renderer {<br>    // wgpu<br>    // ...<br><br>    // egui<br>    context: egui::Context,<br>    raw_input: egui::RawInput,<br>    egui_rpass: RenderPass<br>}</pre><p>Now, we’ll modify the initialization function to set up egui:</p><pre>pub fn new(layer_ptr: *mut std::ffi::c_void, width: u32, height: u32, display_scale: f32) -&gt; Self {<br>    // Setup wgpu<br>    // ...<br><br>    // Setup egui<br>    let context = egui::Context::default();<br>    let raw_input = egui::RawInput {<br>        viewport_id: egui::ViewportId::ROOT,<br>        viewports: std::iter::once((egui::ViewportId::ROOT, egui::ViewportInfo {  <br>            native_pixels_per_point: Some(display_scale),<br>            focused: Some(true),<br>            ..Default::default() <br>        })).collect(),<br>        predicted_dt: 1.0 / 120.0, // Setup according to your needs<br>        focused: true,<br>        system_theme: None,<br>        max_texture_side: Some(wgpu::Limits::default().max_texture_dimension_2d as usize),<br>        ..Default::default()<br>    };<br>    let egui_rpass = RenderPass::new(&amp;device, tex_format, 1);<br><br>    // ...<br>}</pre><p>The rendering function involves two critical steps:</p><ol><li>Updating the UI state</li><li>Sending the rendered UI to the GPU</li></ol><p>Here’s an overview of the rendering process:</p><pre>pub fn render(&amp;mut self, time: f64) {<br>    <br>    let device = &amp;self.device;<br>    let queue = &amp;self.queue;<br>    let surface = &amp;self.surface;<br><br>    let ctx = &amp;self.context;<br>    let egui_rpass = &amp;mut self.egui_rpass;<br>    <br>    // Setup the input for the egui<br>    self.raw_input.time = Some(time);<br><br>    let rect = egui::Rect::from_min_size(<br>        egui::Pos2::ZERO, <br>        egui::vec2(<br>            
self.config.width as f32 / ctx.pixels_per_point(), <br>            self.config.height as f32 / ctx.pixels_per_point()<br>        )<br>    );<br>    self.raw_input.screen_rect = Some(rect);<br><br>    // Run a single layout pass<br>    let full_output = ctx.run(self.raw_input.take(), |ctx| {<br>        egui::CentralPanel::default().show(&amp;ctx, |ui| {<br>            ui.label(&quot;Hello, world!&quot;);<br>        });<br>    });<br><br>    // Prepare resources for rendering<br>    let paint_jobs = ctx.tessellate(full_output.shapes, ctx.pixels_per_point());<br><br>    // Handle rendering routines<br>    let frame = surface.get_current_texture().expect(&quot;Failed to get next frame&quot;);<br>    let view = frame.texture.create_view(&amp;wgpu::TextureViewDescriptor::default());<br><br>    let mut encoder = device.create_command_encoder(&amp;wgpu::CommandEncoderDescriptor {<br>        label: Some(&quot;Render Encoder&quot;),<br>    });<br><br>    let screen_descriptor = ScreenDescriptor {<br>        physical_width: self.config.width,<br>        physical_height: self.config.height,<br>        scale_factor: ctx.pixels_per_point(),<br>    };<br>    let tdelta: egui::TexturesDelta = full_output.textures_delta;<br><br>    egui_rpass<br>        .add_textures(&amp;device, &amp;queue, &amp;tdelta)<br>        .expect(&quot;add texture ok&quot;);<br><br>    egui_rpass.update_buffers(&amp;device, &amp;queue, &amp;paint_jobs, &amp;screen_descriptor);<br><br>    // Execute all render passes.<br>    egui_rpass<br>        .execute(<br>            &amp;mut encoder,<br>            &amp;view,<br>            &amp;paint_jobs,<br>            &amp;screen_descriptor,<br>            Some(wgpu::Color::BLACK),<br>        )<br>        .unwrap();<br><br>    // Submit commands to the GPU<br>    self.queue.submit(Some(encoder.finish()));<br><br>    // Present the frame<br>    frame.present();<br><br>    // Clean up resources after using<br>    
egui_rpass.remove_textures(tdelta).unwrap();<br>}</pre><p>When implemented correctly, you’ll see your first egui label rendered within the SwiftUI view.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*D99GqATbr5XtIAPJbVIrLg.png" /></figure><p>This marks our first significant milestone, but there’s still much to explore and optimize in the integration process.</p><h3>Crafting a Native-Feeling User Interface</h3><p>Our initial implementation was functional, but now it’s time to refine the UI to make it more user-friendly and system-integrated. Both implementations of SwiftUI and Rust parts can be checked <a href="https://github.com/alex566/swiftui-egui-demo/blob/main/SmoothChat/SmoothChat/Views/ChatListView.swift">here</a> and <a href="https://github.com/alex566/swiftui-egui-demo/blob/main/smooth-ui/src/chat_ui.rs">here</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*HFBkyQkrPMLp_nKjRH4ikg.png" /></figure><p>Now it looks better, but something is still bothering me as a macOS user. 
I want to see the SF font in my UI, just like in the rest of the system, so let’s set up system fonts.</p><p>For this purpose, I found a handy library in Rust:</p><pre>font-kit = &quot;0.14.2&quot;</pre><p>Now we can extend our egui setup with font configurations:</p><pre>// Setup egui<br>let context = egui::Context::default();<br><br>let display_font_data = load_font_data_by_name(&quot;SF Pro Text Medium&quot;);<br><br>if let Ok(display_font_data) = display_font_data {<br><br>    let mut fonts = egui::FontDefinitions::default();<br><br>    let display_font_data = Arc::new(egui::FontData::from_owned(display_font_data));<br>    fonts.font_data.insert(&quot;SF-Pro-Text-Medium&quot;.to_owned(), display_font_data);<br><br>    let display_family = egui::FontFamily::Name(&quot;SF-Pro-Text&quot;.into());<br>    fonts.families.insert(display_family.clone(), vec![&quot;SF-Pro-Text-Medium&quot;.to_owned()]);<br><br>    context.set_fonts(fonts);<br>    <br>    context.all_styles_mut(|style| {<br>        let text_styles: BTreeMap&lt;_, _&gt; = [<br>            (egui::TextStyle::Heading, egui::FontId::new(11.0, display_family.clone())),<br>            (egui::TextStyle::Body, egui::FontId::new(13.0, display_family.clone())),<br><br>            (egui::TextStyle::Button, egui::FontId::new(14.0, display_family.clone())),<br>            (egui::TextStyle::Small, egui::FontId::new(10.0, display_family.clone())),<br>        ].into();<br>        <br>        style.text_styles = text_styles;<br>    });<br>}</pre><p>Now the result looks much better, and as a bonus, it can display SF Symbols.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*WjTnvoPDCFg9tyVeeABzLg.png" /></figure><h3>Connecting Input/Output</h3><p>We’ve reached the most challenging part of our integration. Handling input and output can be complex, as the state provided by egui might conflict with the state requested by the SwiftUI view. 
However, I’ve developed a compromise that should work.</p><p>Let’s start by creating input events for mouse interactions on the Rust side:</p><pre>pub enum InputEvent {<br>    PointerMoved(f32, f32),<br>    MouseWheel(f32, f32),<br>    LeftMouseDown(f32, f32, bool),<br>    RightMouseDown(f32, f32, bool),<br>    WindowFocused(bool),<br>    // And the rest of the events for keyboard if needed<br>}<br><br>impl Into&lt;Event&gt; for InputEvent {<br>   <br>    fn into(self) -&gt; Event {<br>        match self {<br>            InputEvent::PointerMoved(x, y) =&gt; Event::PointerMoved(egui::Pos2::new(x, y)),<br>            InputEvent::MouseWheel(x, y) =&gt; Event::MouseWheel { unit: egui::MouseWheelUnit::Point, delta: egui::vec2(x, y), modifiers: Modifiers::default() },<br>            InputEvent::LeftMouseDown(x, y, pressed) =&gt; Event::PointerButton { pos: egui::Pos2::new(x, y), button: egui::PointerButton::Primary, pressed, modifiers: Modifiers::default() },<br>            InputEvent::RightMouseDown(x, y, pressed) =&gt; Event::PointerButton { pos: egui::Pos2::new(x, y), button: egui::PointerButton::Secondary, pressed, modifiers: Modifiers::default() },<br>            InputEvent::WindowFocused(focused) =&gt; Event::WindowFocused(focused),<br>        }<br>    }<br>}</pre><p>Now, we’ll update our render function to handle input events before updating the UI:</p><pre>pub fn render(&amp;mut self, time: f64, input_events: Vec&lt;InputEvent&gt;, /* other arguments */) {<br>    <br>    // Our previous input setup<br>    self.raw_input.events = input_events.into_iter().map(|e| e.into()).collect();<br><br>    // Update the UI and render the result<br>}</pre><p><strong>Important Note:</strong> Remember to add these events to the FFI to make them visible to Swift. 
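</p>

<p>One detail worth getting right in this hand-off: the view only accumulates events between frames, and each render call must consume the whole batch exactly once. Stripped of the FFI types, the Rust side boils down to draining a Vec (a simplified stand-in for the real types):</p>

```rust
// Events pile up between frames; each render call takes the whole batch
// and leaves the queue empty, so no event is processed twice.
#[derive(Debug, PartialEq)]
enum PointerEvent {
    Moved(f32, f32),
    Focused(bool),
}

struct EventQueue {
    gathered: Vec<PointerEvent>,
}

impl EventQueue {
    fn push(&mut self, e: PointerEvent) {
        self.gathered.push(e);
    }

    // Hand the batch to the renderer and reset the queue in one move.
    fn drain_events(&mut self) -> Vec<PointerEvent> {
        std::mem::take(&mut self.gathered)
    }
}
```

<p>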
The full FFI setup can be intricate, so I recommend checking the <a href="https://github.com/alex566/swiftui-egui-demo/blob/main/smooth-ui/src/ffi.rs">complete</a> implementation for detailed specifics.</p><p>Now, let’s return to the Swift project and create a <a href="https://github.com/alex566/swiftui-egui-demo/blob/main/SmoothChat/SmoothChat/Views/SmoothView/InputOutputHandlingView.swift">new</a> NSView for input handling. Our SmoothNSView will inherit from this custom view:</p><pre>class InputOutputHandlingView: NSView {<br>    <br>    private var trackingArea: NSTrackingArea?<br>    private var gatheredEvents: [InputEvent] = []<br>    <br>    override var acceptsFirstResponder: Bool {<br>        true<br>    }<br>    <br>    func draingEvents() -&gt; RustVec&lt;InputEvent&gt; {<br>        // Collect events into rust vector and remove all currently gathered events<br>    }<br>    <br>    // Install tracking area every time the view was resized<br>    // To track the pointer movement<br>    override func updateTrackingAreas() {<br>        super.updateTrackingAreas()<br>        <br>        if let trackingArea {<br>            removeTrackingArea(trackingArea)<br>        }<br>        <br>        let options: NSTrackingArea.Options = [<br>            .mouseEnteredAndExited,<br>            .mouseMoved,<br>            .activeInKeyWindow<br>        ]<br>        let trackingArea = NSTrackingArea(rect: bounds, options: options, owner: self, userInfo: nil)<br>        addTrackingArea(trackingArea)<br>        <br>        self.trackingArea = trackingArea<br>    }<br><br>    // MARK: - Movement tracking<br>    <br>    override func mouseEntered(with event: NSEvent) {<br>        // Map into our InputEvent and add to gatheredEvents<br>    }<br>    <br>    override func mouseMoved(with event: NSEvent) {<br>        // Map into our InputEvent and add to gatheredEvents<br>    }<br>    <br>    override func mouseExited(with event: NSEvent) {<br>        // Map into our InputEvent and 
add to gatheredEvents<br>    }<br>    <br>    // MARK: - Mouse Button and Scroll Events<br>    <br>    override func mouseDown(with event: NSEvent) {<br>        // Map into our InputEvent and add to gatheredEvents<br>        super.mouseDown(with: event)<br>    }<br>    <br>    override func mouseUp(with event: NSEvent) {<br>        // Map into our InputEvent and add to gatheredEvents<br>        super.mouseUp(with: event)<br>    }<br>    <br>    override func mouseDragged(with event: NSEvent) {<br>        // Map into our InputEvent and add to gatheredEvents<br>        super.mouseDragged(with: event)<br>    }<br>    <br>    override func rightMouseDown(with event: NSEvent) {<br>        // Map into our InputEvent and add to gatheredEvents<br>        super.rightMouseDown(with: event)<br>    }<br>    <br>    override func rightMouseUp(with event: NSEvent) {<br>        // Map into our InputEvent and add to gatheredEvents<br>        super.rightMouseUp(with: event)<br>    }<br>    <br>    override func rightMouseDragged(with event: NSEvent) {<br>        // Map into our InputEvent and add to gatheredEvents<br>        super.rightMouseDragged(with: event)<br>    }<br>    <br>    override func scrollWheel(with event: NSEvent) {<br>        // Map into our InputEvent and add to gatheredEvents<br>        super.scrollWheel(with: event)<br>    }<br>}</pre><p>With this implementation, we can now gather input events from the view and pass them to the renderer:</p><pre>@objc<br>func render(displayLink: CADisplayLink) {<br>    let events = view.draingEvents()<br>    renderer.render(<br>        displayLink.timestamp,<br>        events,<br>        controller<br>    )<br>}</pre><p>Now we can scroll our chat and observe the smooth scrolling in action.</p><p>However, there is still one crucial detail missing: our system doesn’t react to what is happening inside the view. To address this, we need to develop a way to connect the output. 
Let’s use cursor switch events as an example.</p><p>We’ll start by creating additional structures on the Rust side:</p><pre>pub struct OutputState {<br>    cursor_icon: CursorIcon,<br>}<br><br>pub enum CursorIcon {<br>    Default,<br>    PointingHand,<br>    ResizeHorizontal,<br>    ResizeVertical,<br>    Text,<br>}<br><br>impl From&lt;egui::CursorIcon&gt; for CursorIcon {<br>    <br>    fn from(cursor_icon: egui::CursorIcon) -&gt; Self {<br>        match cursor_icon {<br>            egui::CursorIcon::Default =&gt; Self::Default,<br>            egui::CursorIcon::PointingHand =&gt; Self::PointingHand,<br>            egui::CursorIcon::ResizeHorizontal | egui::CursorIcon::ResizeColumn =&gt; Self::ResizeHorizontal,<br>            egui::CursorIcon::ResizeVertical | egui::CursorIcon::ResizeRow =&gt; Self::ResizeVertical,<br>            egui::CursorIcon::Text =&gt; Self::Text,<br>            other =&gt; {<br>                println!(&quot;Unsupported cursor icon: {:?}&quot;, other);<br>                Self::Default<br>            }<br>        }<br>    }<br>}</pre><p>Next, we’ll update the render function to return this state after rendering the UI:</p><pre>pub fn render(&amp;mut self, /* other arguments */) -&gt; OutputState {<br>    <br>    // Updating UI and rendering<br><br>    OutputState::new(full_output.platform_output.cursor_icon.into())<br>}</pre><p>The FFI module should be updated accordingly. 
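</p><p>Before moving to the Swift side, here is what the accessors the Swift code relies on (OutputState::new and get_cursor_icon) can look like as plain Rust. This is a hedged sketch for illustration only: the derives are additions for the standalone example, and the real type crosses the boundary through the project’s FFI module rather than being used directly like this:</p>

```rust
// Sketch of the OutputState accessors used across the boundary.
// The derives are additions for this standalone example; the real
// type is exposed through the project's FFI module.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum CursorIcon {
    Default,
    PointingHand,
    ResizeHorizontal,
    ResizeVertical,
    Text,
}

pub struct OutputState {
    cursor_icon: CursorIcon,
}

impl OutputState {
    // Built once per frame from egui's platform output.
    pub fn new(cursor_icon: CursorIcon) -> Self {
        Self { cursor_icon }
    }

    // Read on the Swift side after rendering to sync the NSCursor.
    pub fn get_cursor_icon(&self) -> CursorIcon {
        self.cursor_icon
    }
}
```

<p>Returning the enum by value keeps the FFI surface trivial: the Swift side only ever sees a small copyable tag, never a reference into Rust-owned memory. </p><p>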
If you encounter any issues with this part, you can refer to <a href="https://github.com/alex566/swiftui-egui-demo/blob/main/smooth-ui/src/ffi.rs">this</a> FFI module implementation for guidance.</p><p>Now we can add output handling to our <a href="https://github.com/alex566/swiftui-egui-demo/blob/main/SmoothChat/SmoothChat/Views/SmoothView/InputOutputHandlingView.swift">InputOutputHandlingView</a>:</p><pre>func handle(output: OutputState) {<br>    let cursorIcon = output.get_cursor_icon()<br>    let newCursor = cursorToNSCursor(cursorIcon)<br>    <br>    if currentCursor != newCursor {<br>        if let activeCursor = currentCursor {<br>            activeCursor.pop()<br>            currentCursor = nil<br>        }<br>        <br>        newCursor.push()<br>        currentCursor = newCursor<br>    }<br>}<br><br>func resetCursor() {<br>    if let activeCursor = currentCursor {<br>        activeCursor.pop()<br>        currentCursor = nil<br>    }<br>}</pre><p>Next, connect the result of rendering to this handler:</p><pre>@objc<br>func render(displayLink: CADisplayLink) {<br>    let events = view.draingEvents()<br>    let state = renderer.render(<br>        displayLink.timestamp,<br>        events,<br>        controller<br>    )<br>    view.handle(output: state)<br>}</pre><p>And that’s it. When you run the project, you’ll now be able to see the cursor switch when pointing at text. However, you won’t yet be able to copy text, as that would require implementing additional output handling. This concludes the core implementation of integrating egui with SwiftUI.</p><h3>Conclusion</h3><p>In this article, I’ve provided a basic overview of the mechanisms involved, but many details are still left out. This should serve as a good starting point for you to implement these concepts on your own. If you spot any errors or have suggestions for improving my implementation, I would be grateful to hear your thoughts. Let’s evolve this into something even better together! 
And a big thank you if you’ve made it this far!</p><h3>Open points</h3><ul><li>I’m still considering how to wrap this into a utility library with a more convenient interface. If you have any ideas, I’d love to hear them!</li><li>Additionally, the architecture might feel a bit tricky, as both the SwiftUI and Rust parts essentially need their own “view” and “controller” layers. If you have thoughts on how to improve this aspect, feel free to share.</li></ul><p>The full source code can be found <a href="https://github.com/alex566/swiftui-egui-demo">here</a>.</p>]]></content:encoded>
        </item>
    </channel>
</rss>