<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet href="/styles.xsl" type="text/xsl"?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>Nick Winans</title><description>A blog by Nick Winans</description><link>https://nick.winans.io/</link><item><title>Attention Span August</title><link>https://nick.winans.io/blog/attention-span-august/</link><guid isPermaLink="true">https://nick.winans.io/blog/attention-span-august/</guid><description>A month without social media or recommendation algorithms.</description><pubDate>Thu, 31 Jul 2025 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;This August, I&apos;m taking a month-long break from social media and recommendation algorithms&lt;sup&gt;&lt;a href=&quot;#fn1&quot;&gt;[1]&lt;/a&gt;&lt;/sup&gt;. This is a personal challenge to see how it affects my attention span, productivity, and overall mental health.&lt;/p&gt;
&lt;h2&gt;Why?&lt;/h2&gt;
&lt;p&gt;We all know that social media and their recommendation algorithms are made to keep us engaged for as long as possible. They do a really good job at it, and I often find myself scrolling mindlessly, every once in a while becoming self-aware and wondering why I just wasted a minute watching a clip of a show I&apos;ve never even heard of.&lt;/p&gt;
&lt;p&gt;Six months ago, I installed &lt;a href=&quot;https://www.screenzen.co&quot;&gt;ScreenZen&lt;/a&gt;, an app designed to stop the doomscrolling and give you back your time and attention. I set a 15-minute limit per app per day, and it reduced my screen time immensely. At the same time, it&apos;s created a new habit, and a feeling, that might somehow be even worse.&lt;/p&gt;
&lt;p&gt;Now that I have a set limit per day, I &lt;em&gt;need&lt;/em&gt; to use every minute to its fullest &lt;em&gt;every&lt;/em&gt; day. During an app session there&apos;s no pausing to look up something that piqued my interest; I can&apos;t waste my precious seconds on that. Video or post not interesting enough? Swipe away fast to get to the next thing that may be more worth it. It&apos;s become a chore of squeezing out as much entertainment as I can. And I feel &lt;em&gt;anxious&lt;/em&gt; going through this ritual every day.&lt;/p&gt;
&lt;p&gt;Why should I continue this?&lt;/p&gt;
&lt;h2&gt;What will I do instead?&lt;/h2&gt;
&lt;p&gt;I want to spend more time creating things. I find more fulfillment in the things I&apos;ve made. When I look back on years prior, I don&apos;t think I could tell you about one truly meaningful TikTok or Twitter post I saw. What I create is what comes to mind, and it defines my own self-image. I&apos;m going to write more and &lt;a href=&quot;/blog/ink-link/&quot;&gt;build&lt;/a&gt; more &lt;a href=&quot;/blog/perfect-keyboard&quot;&gt;things&lt;/a&gt; to nurture that.&lt;/p&gt;
&lt;p&gt;Of course, life can&apos;t be all output. My consumption needs a change of pace, too. Embarrassingly, I&apos;ll admit that this month I finally read a book in full for the first time since &lt;em&gt;middle school&lt;/em&gt;. And it was great! I want to read more books and watch more shows (yes, even &lt;em&gt;The Wire&lt;/em&gt;). I&apos;ll actually have a chance at remembering them in a few years.&lt;/p&gt;
&lt;h2&gt;Join me!&lt;/h2&gt;
&lt;p&gt;If you&apos;ve ever felt that same anxiety, that sense of time wasted on digital chores, consider this your sign.&lt;/p&gt;
&lt;hr /&gt;
&lt;section&gt;
&lt;ol&gt;
&lt;li&gt;&lt;p&gt;I&apos;ll likely keep YouTube, but I&apos;ll disable the home page and recommendation bar. I&apos;ll only watch a few subscriptions and videos I find externally or specifically search for. &lt;a href=&quot;#fnref1&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/section&gt;
</content:encoded></item><item><title>Building My Perfect Keyboard — The Plan</title><link>https://nick.winans.io/blog/perfect-keyboard/</link><guid isPermaLink="true">https://nick.winans.io/blog/perfect-keyboard/</guid><description>Perfecting every part of my Lily58 keyboard.</description><pubDate>Tue, 29 Jul 2025 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;I use a wireless Lily58 (a split, ergonomic keyboard) at work, and it&apos;s great. I love the layout, the slim profile, and the long battery life. Recently, though, I built a Photon 65% keyboard from CannonKeys, and it&apos;s made me realize how hollow my work board sounds &lt;em&gt;and&lt;/em&gt; feels.&lt;/p&gt;
&lt;p&gt;It&apos;s time to change that!&lt;/p&gt;
&lt;h2&gt;Sound Comparison&lt;/h2&gt;
&lt;p&gt;Before I get into the details of my plan, I want to share a sound comparison to illustrate where I&apos;m starting from and what I&apos;m aiming for. I&apos;ve recorded the sound of my current Lily58, a Corne with an aluminum case, and the Photon.&lt;/p&gt;
&lt;p&gt;Here&apos;s how they sound:&lt;/p&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Board&lt;/th&gt;
&lt;th&gt;Sound Test&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href=&quot;https://typeractive.xyz/pages/build/lily58_choc?c=%257B%2522cases%2522%253A%2522navy-3dp%2522%252C%2522display_covers%2522%253A%2522navy-3dp-cutout%2522%252C%2522batteries%2522%253A%2522110mah-3-7v-lipo%2522%252C%2522headers%2522%253A%2522no-solder-view%2522%252C%2522microcontrollers%2522%253A%2522nice-nano%2522%252C%2522displays%2522%253A%2522nice-view%2522%252C%2522switches%2522%253A%2522kailh-choc-red%2522%252C%2522keycaps%2522%253A%2522sc-mbk-legend-ergo%2522%257D&amp;amp;e=%255B%255D&quot;&gt;Lily58&lt;/a&gt; (Current)&lt;/td&gt;
&lt;td&gt;&lt;audio controls preload=&quot;metadata&quot; src=&quot;./lily58.mp3&quot;&gt;&lt;/audio&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href=&quot;https://typeractive.xyz/pages/build/corne_choc?c=%257B%2522cases%2522%253A%2522midnight-premium%2522%252C%2522display_covers%2522%253A%2522clear-acrylic%2522%252C%2522batteries%2522%253A%2522110mah-3-7v-lipo%2522%252C%2522headers%2522%253A%2522no-solder-view%2522%252C%2522microcontrollers%2522%253A%2522nice-nano%2522%252C%2522displays%2522%253A%2522nice-view%2522%252C%2522switches%2522%253A%2522kailh-choc-pink%2522%252C%2522keycaps%2522%253A%2522mbk-hypersonic%2522%257D&amp;amp;e=%255B%255D&quot;&gt;Corne Premium&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;audio controls preload=&quot;metadata&quot; src=&quot;./corne.mp3&quot;&gt;&lt;/audio&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href=&quot;https://cannonkeys.com/products/photon&quot;&gt;Photon&lt;/a&gt; (Target)&lt;sup&gt;&lt;a href=&quot;#fn1&quot;&gt;[1]&lt;/a&gt;&lt;/sup&gt;&lt;/td&gt;
&lt;td&gt;&lt;audio controls preload=&quot;metadata&quot; src=&quot;./photon.mp3&quot;&gt;&lt;/audio&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;I hope you can hear a big difference. The Lily58 is hollow and the switch stabilizers rattle a lot. The Corne certainly sounds a bit more solid, but the switches are still shallow in sound profile. Then we have the Photon, wow! The sound is deep, crisp, and satisfying. It&apos;s got &lt;em&gt;thock&lt;/em&gt;!&lt;/p&gt;
&lt;h2&gt;The Plan&lt;/h2&gt;
&lt;p&gt;The first step is going to be picking the switches. The existing Kailh Choc v1 switches just don&apos;t sound or feel that great. This isn&apos;t surprising to anyone who&apos;s tried them. At the same time, the full-size MX-style switches in the Photon are too tall for a small ergo board like the Lily58, and they would make for a disproportionately tall build.&lt;/p&gt;
&lt;p&gt;As a middle ground, the Choc v2 and the Gateron KS-33 (or Low Profile 2.0) switches are the clear options. Both are decently low-profile, yet they should provide a better sound and feel than the Choc v1 switches. While I don&apos;t expect we&apos;ll get to sounding as good as the Photon, we&apos;ll do our best.&lt;/p&gt;
&lt;p&gt;The Choc v2 switches can sound pretty good. I know a couple of people with the &lt;a href=&quot;https://www.lofree.co/products/lofree-flow-the-smoothest-mechanical-keyboard&quot;&gt;Lofree Flow&lt;/a&gt;, and I&apos;m pretty impressed with how the all-POM Choc v2 switches sound. The KS-33s, on the other hand, are recommended more often, so I&apos;m willing to give them a try, too.&lt;/p&gt;
&lt;p&gt;Great switches are only one part of the equation. After picking my switches, I&apos;ll be designing a new aluminum chassis, which should give us a more solid feel and a deeper sound than the existing thin plastic case.&lt;/p&gt;
&lt;p&gt;A new PCB will also be part of the design. No working around a separate microcontroller board like the nice!nano. By integrating the wireless controller directly onto the board, the final build can be slimmer and more cohesive.&lt;/p&gt;
&lt;p&gt;I&apos;m excited.&lt;/p&gt;
&lt;h2&gt;What&apos;s Next?&lt;/h2&gt;
&lt;p&gt;I&apos;ve gone ahead and ordered a &lt;a href=&quot;https://www.gateron.com/products/gateron-low-profile-switches-series-tester-with-sample-set&quot;&gt;Gateron KS-33 switch tester&lt;/a&gt; and &lt;a href=&quot;https://kailhswitch.net/collections/choc-switch&quot;&gt;every Choc v2 switch&lt;/a&gt; I could find. In the next post, I&apos;ll share my thoughts on the switches and which ones I choose.&lt;/p&gt;
&lt;p&gt;I want to share every step of the process of building this keyboard. My goal isn&apos;t just to show off the final product, but to help you understand how each part fits together to create the best possible keyboard (for me, at least!).&lt;/p&gt;
&lt;hr /&gt;
&lt;section&gt;
&lt;ol&gt;
&lt;li&gt;&lt;p&gt;It&apos;s pretty important to note that this is using TTC Venus switches, GMK keycaps, TX &quot;AP&quot; Stabilizers Rev. 4, and case foam. All of these make a difference to the sound profile. &lt;a href=&quot;#fnref1&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/section&gt;
</content:encoded></item><item><title>I Made a Million Dollar Product from My Dorm Room</title><link>https://nick.winans.io/blog/nice-nano/</link><guid isPermaLink="true">https://nick.winans.io/blog/nice-nano/</guid><description>The story of the nice!nano, a wireless Pro Micro-compatible microcontroller board I made in my freshman year of college.</description><pubDate>Sun, 23 Mar 2025 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;&lt;em&gt;This post shares the story of the &lt;a href=&quot;https://nicekeyboards.com/nice-nano/&quot;&gt;nice!nano&lt;/a&gt;, a wireless, Pro Micro-compatible microcontroller board I made in my freshman year of college. The nice!nano powers tens of thousands of keyboards, has inspired many, and changed my life.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Over my first winter break in college, I created what I called the &lt;a href=&quot;https://github.com/Nicell/Dissatisfaction-65&quot;&gt;Dissatisfaction65&lt;/a&gt;, a wireless 65% keyboard inspired by the Satisfaction75. I don&apos;t remember exactly why, but I wanted to try making a DIY wireless keyboard after having made a few wired ones. I used the Adafruit 32u4 Bluefruit LE microcontroller to go wireless, since the open-source QMK keyboard firmware supported Bluetooth on that specific board. The project looked great in the end, but its performance was &lt;em&gt;awful&lt;/em&gt;. The typing latency was nearly unusable, and it only lasted a few days on battery even with a huge battery inside.&lt;/p&gt;
&lt;p&gt;Seeing all the low-latency, long battery-life wireless products from companies like Logitech and Apple, I knew that something better was possible. In the next two months I dove into the world of wireless microcontrollers and DIY keyboards. I quickly learned that Nordic microchips were the hobbyist&apos;s choice and the Pro Micro format reigned as king for DIY keyboards. In my search I discovered three microcontrollers trying to fill the gap between the two: the &lt;a href=&quot;https://github.com/jpconstantineau/NRF52-Board/tree/b739f4d053c72c3307a3888611c3a73fe1c1b757&quot;&gt;BlueMicro&lt;/a&gt;, the &lt;a href=&quot;https://github.com/joric/nrfmicro/tree/cf9c53bb59e2bb070b8206cb768ca13510e35dc8&quot;&gt;nRFMicro&lt;/a&gt;, and the &lt;a href=&quot;https://github.com/sekigon-gonnoc/BLE-Micro-Pro/tree/4d61fb50ae3e1cce0928c20ccf690988b3119318&quot;&gt;BLE-Micro-Pro&lt;/a&gt;.&lt;sup&gt;&lt;a href=&quot;#fn1&quot;&gt;[1]&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Board&lt;/th&gt;
&lt;th&gt;Retail Cost&lt;/th&gt;
&lt;th&gt;Pro Micro Form Factor&lt;/th&gt;
&lt;th&gt;Open Source&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;BlueMicro&lt;/td&gt;
&lt;td&gt;N/A&lt;/td&gt;
&lt;td&gt;Too Large&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;nRFMicro&lt;/td&gt;
&lt;td&gt;N/A&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;BLE-Micro-Pro&lt;/td&gt;
&lt;td&gt;~$40&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;The BlueMicro&apos;s form factor meant that I couldn&apos;t build most Pro Micro keyboards since its larger footprint would interfere with the rest of the board. The BLE-Micro-Pro was pretty expensive, locked down, and only sold in Japan. The nRFMicro was pretty close. At first, I decided to modify the nRFMicro to fit my needs, but I soon realized my goals were a bit too ambitious, so I restarted from scratch.&lt;/p&gt;
&lt;h2&gt;The nice!nano was born&lt;/h2&gt;
&lt;p&gt;The weekend I created the nice!nano (yes, the whole thing was designed in a weekend), I don&apos;t think I left my desk for more than sleeping and getting food from the dining hall maybe three times. It was just me, &lt;a href=&quot;https://www.kicad.org/&quot;&gt;KiCad&lt;/a&gt;, Nordic&apos;s Infocenter&lt;sup&gt;&lt;a href=&quot;#fn2&quot;&gt;[2]&lt;/a&gt;&lt;/sup&gt;, the &lt;a href=&quot;https://github.com/joric/nrfmicro/wiki&quot;&gt;nRFMicro wiki&lt;/a&gt;, and the &lt;a href=&quot;https://learn.adafruit.com/introducing-the-adafruit-nrf52840-feather/downloads&quot;&gt;Adafruit nRF52840 Feather schematic&lt;/a&gt;. I put together the schematic and BOM, laid out the PCB, and routed (and re-routed) the connections. On the other side I came out with the thinnest Pro Micro-compatible nRF52840-based board.&lt;/p&gt;
&lt;p&gt;Over the next week I created a name and found my PCB assembler. The name is based on my online username, &quot;Nicell&quot;. I wanted to continue the spirit of metric naming of the Pro Micro and came up with &quot;nice!nano&quot;. The stylized lower-case pixel font mark was created to sit atop the antenna. After reaching out to a few assemblers, the cheapest option for producing five was about $100. That was a lot to spend on what could&apos;ve easily been a broken design, but after a few days of meticulously re-reviewing my designs, I paid.&lt;sup&gt;&lt;a href=&quot;#fn3&quot;&gt;[3]&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/nano-v0.1.49mKzayw_ZPP4RR.webp&quot; alt=&quot;The first nice!nano with the charging LED on&quot; /&gt;&lt;/p&gt;
&lt;p&gt;A few weeks later the boards showed up at my door. I was both ecstatic and terrified they wouldn&apos;t work. As I plugged in my first one I closed my eyes, tensed up, and peeked. To my surprise and relief, they worked! Over the next couple of weeks I built a Lily58 with them and got a modified version of QMK working on it. In my testing I found the board could last a few weeks on a 110mAh battery. Compared to the Dissatisfaction65, which lasted a few days on a 2,500mAh battery, that was over a 100x improvement in power efficiency. I was elated, and I posted my fully wireless Lily58 on Reddit and &lt;a href=&quot;https://www.reddit.com/r/MechanicalKeyboards/comments/fzlfy8/fully_wireless_lily58_pro/&quot;&gt;it got quite a bit of interest&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/lily.40oOEibJ_Z24cSYS.webp&quot; alt=&quot;The Lily58&quot; /&gt;&lt;/p&gt;
&lt;p&gt;Over the next few weeks my tiny Discord grew into a sizable community focused on wireless keyboard innovation. I launched an interest check for a group buy, made a few more refinements to the nice!nano, and then I was ready to launch the group buy in mid-June.&lt;/p&gt;
&lt;h2&gt;Group buys are awful&lt;/h2&gt;
&lt;p&gt;As a college student, I didn&apos;t have the money to bankroll a purchase of 1,000 nice!nanos, so I ran a group buy pre-purchase. At the time I had set a minimum purchase amount of 200 pieces for the order to go through and a maximum of 1,000 because I didn&apos;t think I could handle more than that. I set the end date for a month later. In the end, it wasn&apos;t even open for a day.&lt;/p&gt;
&lt;p&gt;The sale went live on June 20th at 11am central. Within the first few minutes I had met my minimum purchase amount. I remember sitting in my childhood bedroom (thanks covid) on the Shopify dashboard watching orders pour in. It was an incredible feeling. Within just seven hours all 1,000 nice!nanos had been sold, ending the group buy. In the next two months I got all the product in and shipped out the 400+ unique orders with the help of my family.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/facebook-post.DQaTztmE_1d0k4z.webp&quot; alt=&quot;My mom&apos;s facebook post about the process&quot; /&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;My mom posted about the fulfillment process on Facebook. It was a family effort!&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;With the success of the group buy, you might wonder, what&apos;s so awful? Well, it was extremely stressful holding on to so many people&apos;s money without a physical product to back it yet. Along with PayPal holding half the funds of the group buy for a while, it was a bit terrifying. At the same time, group buys have caused the mechanical keyboard community a lot of strife with stolen funds and extremely delayed projects. When I see well-established stores that &lt;em&gt;should&lt;/em&gt; have capital running group buys, I can&apos;t help but shake my head. I decided shortly after this that I will never run a group buy again.&lt;/p&gt;
&lt;h2&gt;ZMK&lt;/h2&gt;
&lt;p&gt;Rewinding a couple of months, as I was waiting for group buy product to come in, there was still a fairly major part of the ecosystem missing: decent firmware. I bounced between different existing options, unsatisfied with the results. That was until I was connected with Pete Johanson, who coincidentally had started working on a wireless keyboard firmware powered by the modern Zephyr RTOS.&lt;/p&gt;
&lt;p&gt;I quickly sent some pre-production units to Pete to mess around with. Shortly after, he got an early version of ZMK working on the nice!nano, and we hit the ground running building a new wireless-first firmware with a low-power focus. By early 2021 a small community led by Pete had created an extremely performant and feature-full wireless firmware.&lt;/p&gt;
&lt;h2&gt;Settling in&lt;/h2&gt;
&lt;p&gt;In 2021 I really settled in to my new small business. My vendor network was growing around the world, nice!nanos were flying off the shelves and hard to keep in stock, and the ZMK community continued to grow and strengthen. Other popular ZMK boards started to pop up, many of them inspired by the nice!nano or at least built from my schematic, which I released publicly.&lt;/p&gt;
&lt;p&gt;Everything was looking great, but I noticed that most of my vendors either did not carry all the parts needed for a wireless build or focused their offerings on wired microcontrollers. I figured I could bring something new to this area.&lt;/p&gt;
&lt;h2&gt;Becoming the vendor&lt;/h2&gt;
&lt;p&gt;As a full time student, I knew I couldn&apos;t run a whole ecommerce store easily by myself. Luckily, my parents decided to retire at the end of 2021, and my dad was saying he needed something to keep himself busy. Together in 2022 we started &lt;a href=&quot;https://typeractive.xyz&quot;&gt;Typeractive&lt;/a&gt;, a keyboard store focused on the wireless keyboard experience.&lt;/p&gt;
&lt;p&gt;I created a &lt;a href=&quot;https://typeractive.xyz/pages/build&quot;&gt;3D interactive configuration tool&lt;/a&gt; for people to get all the parts they needed and kits specially designed for wireless boards. This low friction experience was a huge success, and now in 2025 we&apos;re one of the largest split keyboard stores. There&apos;s a lot more that happened with Typeractive, but I can tell that story another time.&lt;/p&gt;
&lt;h2&gt;Cloned, twice!&lt;/h2&gt;
&lt;p&gt;In 2023 the nice!nano was cloned, not once, but twice. Two independently designed copies popped up on Taobao, and it wasn&apos;t long before they ended up on AliExpress and even in my existing vendors&apos; stores. I was a bit shocked, and in the end I found I couldn&apos;t do much about it.&lt;/p&gt;
&lt;p&gt;To be clear, these are &lt;em&gt;clones&lt;/em&gt;. I think competition is fair, but both of these new boards are advertised as nice!nanos and ship with the exact same firmware I use on the nice!nano, so when someone plugs one in, it says it&apos;s a nice!nano. If the manufacturers had just built their own firmware (it&apos;s open source!) and not used the nice!nano in the title of their listings, I would say it&apos;s fair game.&lt;/p&gt;
&lt;p&gt;Seeing my product get cloned gave me mixed feelings. As everyone knows, imitation is the sincerest form of flattery, but seeing them ride the coattails of my work was frustrating. At the end of the day though, their product is subpar, and nice!nanos continue to sell at a consistent rate. Some of that is likely due to the largest DIY wireless keyboard store not stocking them. Thanks, Typeractive!&lt;/p&gt;
&lt;h2&gt;The million dollar product&lt;/h2&gt;
&lt;p&gt;Ok, so the title is a bit of click-bait, but if you made it this far, I suppose it worked. The nice!nano might have been designed in my dorm room, but this was a multi-year journey. To date, over 50,000 nice!nanos have been sold at various online retailers around the world representing over a million dollars in sales. It&apos;s hard to wrap my head around still, and I&apos;m extremely grateful. While I put in a lot of hard work, I also recognize that timing and luck played a significant role. The growing interest in wireless keyboards and the lack of available options in the DIY space created the perfect environment for the nice!nano to thrive.&lt;/p&gt;
&lt;p&gt;Creating this post has been an incredible trip down memory lane. The nice!nano has had immeasurable impact on my life, and it only happened thanks to so many people that helped along the way. In semi-chronological order, I&apos;d like to shout out individuals that helped me immensely:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Joric (creator of the nRFMicro)&lt;/li&gt;
&lt;li&gt;Pierre Constantineau (creator of the BlueMicro board and firmware)&lt;/li&gt;
&lt;li&gt;Pete Johanson (creator of ZMK)&lt;/li&gt;
&lt;li&gt;Mike and Pam (my parents)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Thank you. It&apos;s been incredibly rewarding to see all the custom keyboards built with or derived from the nice!nano. The community is still growing, and I&apos;m glad that the nice!nano gets to be a big part of it.&lt;/p&gt;
&lt;hr /&gt;
&lt;section&gt;
&lt;ol&gt;
&lt;li&gt;&lt;p&gt;I&apos;ve purposefully left links to each repository in the state I would have been reading them back in early 2020. &lt;a href=&quot;#fnref1&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;As I wrote this, I found out that Nordic&apos;s Infocenter was shut down, RIP. &lt;a href=&quot;#fnref2&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;I sometimes laugh at how scary that $100 purchase was for me at the time. All things considered, this is an extremely cheap R&amp;amp;D investment. &lt;a href=&quot;#fnref3&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/section&gt;
</content:encoded></item><item><title>eCommerce Automations</title><link>https://nick.winans.io/blog/ecommerce-automations/</link><guid isPermaLink="true">https://nick.winans.io/blog/ecommerce-automations/</guid><description>Sharing my Shopify store&apos;s automations after 3 years of optimization</description><pubDate>Sun, 23 Feb 2025 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;I&apos;m the owner of &lt;a href=&quot;https://typeractive.xyz&quot;&gt;Typeractive.xyz&lt;/a&gt;, a DIY ergonomic keyboard kit store. We currently average 30 orders a day with 10 items per order. This is relatively high volume considering we only have one person full-time working on the store.&lt;sup&gt;&lt;a href=&quot;#fn1&quot;&gt;[1]&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
&lt;p&gt;Unlike other stores in the ergonomic keyboard kit space, we are rarely out of stock of our 250+ product variants, dispatch &amp;gt;99% of eligible orders in one working day, and respond to all customer requests in an average of five hours.&lt;sup&gt;&lt;a href=&quot;#fn2&quot;&gt;[2]&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
&lt;p&gt;These achievements are only possible thanks to heavy automation of our daily operations, and after three years of running our store, I&apos;d like to share more details about which small automations have created the biggest impact.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Disclaimer: Our store runs on Shopify, so lots of the solutions are centered around some tools that are only available with Shopify.&lt;/em&gt;&lt;sup&gt;&lt;a href=&quot;#fn3&quot;&gt;[3]&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
&lt;h2&gt;Staying in Stock&lt;/h2&gt;
&lt;p&gt;After expanding the store to more variants, we grew quickly and immediately ran into a problem every store owner wishes for: how do we keep our products in stock?&lt;/p&gt;
&lt;h3&gt;Reorder Notifications&lt;/h3&gt;
&lt;p&gt;If I had to get rid of every other automation and keep just one, it&apos;s this. Every day a Cloudflare Worker calculates how many days of stock remain based on past sales, creates a list of products that need to be reordered, and posts it to our team&apos;s internal Discord.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/reorder.CWB69RN-_1WhroW.webp&quot; alt=&quot;Reorder notification example&quot; /&gt;&lt;/p&gt;
&lt;p&gt;Something like this can be implemented in many different ways; here are some more details on ours.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Different products take different amounts of time to order and receive. We use Shopify &lt;a href=&quot;https://help.shopify.com/en/manual/custom-data/metafields&quot;&gt;Metafields&lt;/a&gt; to assign a &quot;Days to Reorder&quot; number to every product.&lt;/li&gt;
&lt;li&gt;Quantity sold per day is found using Shopify&apos;s &quot;&lt;a href=&quot;https://help.shopify.com/en/manual/reports-and-analytics/shopify-reports/report-types/default-reports/inventory-reports#days-of-inventory-remaining&quot;&gt;Days of inventory remaining&lt;/a&gt;&quot; report.&lt;/li&gt;
&lt;li&gt;Once ordered, we create a &lt;a href=&quot;https://help.shopify.com/en/manual/products/inventory/purchase-orders&quot;&gt;Purchase Order&lt;/a&gt; in Shopify, which causes that product to fall off our reorder notification, keeping the list neat and tidy.&lt;/li&gt;
&lt;li&gt;Neither Shopify&apos;s reports nor purchase orders are supported via their API. We access them programmatically by forging a Shopify admin client GraphQL request. You can learn more about forging requests &lt;a href=&quot;https://youtu.be/8GZPQUjd7pk?si=krxvkyiKMIDb1R1I&quot;&gt;from my friend David&lt;/a&gt;. Shopify, &lt;em&gt;please&lt;/em&gt; add these to your API.&lt;/li&gt;
&lt;/ul&gt;
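&lt;p&gt;As a rough illustration, the core of the daily check could look something like the JavaScript sketch below. The field names (&lt;code&gt;daysToReorder&lt;/code&gt;, &lt;code&gt;unitsSoldPerDay&lt;/code&gt;, &lt;code&gt;openPurchaseOrder&lt;/code&gt;) are stand-ins for the Metafield, report, and purchase order data described above, not our actual schema.&lt;/p&gt;

```javascript
// Sketch of the daily reorder check (simplified; field names are illustrative).
// Input: one record per product, combining on-hand inventory, the "Days to
// Reorder" Metafield, average daily sales, and whether an open PO exists.
function productsToReorder(products) {
  const flagged = [];
  for (const p of products) {
    if (p.openPurchaseOrder) continue; // already ordered: keep the list tidy
    if (p.unitsSoldPerDay === 0) continue; // no sales history, nothing to project
    const daysOfStock = p.inventory / p.unitsSoldPerDay;
    // Flag anything that would run out before a reorder could arrive.
    if (p.daysToReorder >= daysOfStock) {
      flagged.push({ title: p.title, daysOfStock: Math.floor(daysOfStock) });
    }
  }
  // Most urgent first.
  flagged.sort((a, b) => a.daysOfStock - b.daysOfStock);
  return flagged;
}
```

&lt;p&gt;In the real Worker, this data is assembled from Shopify and the flagged list is what gets posted to our Discord.&lt;/p&gt;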
&lt;p&gt;I&apos;ve considered packaging this up as a simple Shopify App, but I haven&apos;t gotten around to it. If this might interest you, let me know!&lt;/p&gt;
&lt;h3&gt;3D Printing Dashboard&lt;/h3&gt;
&lt;p&gt;Our store also sells some 3D printed products we produce in-house. These products don&apos;t have significant lead time, but we can only produce a small amount each day using our three BambuLab printers.&lt;/p&gt;
&lt;p&gt;Our existing reorder notification didn&apos;t work as well here: we should be printing continually, and with over 100 3D printed variants it would create significant clutter. Instead, we have a dashboard showing which variants to prioritize printing.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/3d-dashboard.B557OgrI_19ItsB.webp&quot; alt=&quot;3D printing dashboard&quot; /&gt;&lt;/p&gt;
&lt;p&gt;Once again, we use Metafields to define a per-variant &quot;Desired Stock&quot; value. We know some colors are more popular, so we want to have more ready to go. With this dashboard, instead of blindly scrolling through &amp;gt;100 variants figuring out what to print, we can just find which variants are at the lowest percentage of their desired stock level. This reduces the amount of time it takes to start another print, which requires stepping away from fulfilling orders.&lt;/p&gt;
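&lt;p&gt;The prioritization itself is tiny. Here&apos;s a sketch in JavaScript, again with illustrative field names, assuming a per-variant on-hand count and the &quot;Desired Stock&quot; Metafield:&lt;/p&gt;

```javascript
// Sketch of the print-priority ordering (field names are illustrative).
// Each variant carries its on-hand count and its "Desired Stock" Metafield.
function printPriority(variants) {
  return variants
    .map((v) => ({
      name: v.name,
      // Percent of desired stock currently on hand; lowest prints first.
      stockLevel: Math.round((v.onHand / v.desiredStock) * 100),
    }))
    .sort((a, b) => a.stockLevel - b.stockLevel);
}
```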
&lt;p&gt;This dashboard in particular is generated using &lt;a href=&quot;https://retool.com/&quot;&gt;Retool&lt;/a&gt; and Shopify&apos;s Admin GraphQL API. Retool works well, but I don&apos;t think it&apos;s incredible, and there&apos;s probably lots of other options out there. It&apos;s a shame Shopify&apos;s new &quot;&lt;a href=&quot;https://help.shopify.com/en/manual/reports-and-analytics/shopify-reports/new-analytics/explorations&quot;&gt;Explorations&lt;/a&gt;&quot; in Reports don&apos;t appear to support inventory queries like this in any significant capacity.&lt;/p&gt;
&lt;h3&gt;Back in Stock Notifications&lt;/h3&gt;
&lt;p&gt;As a last resort, if we can&apos;t keep items in stock, we have back in stock notifications. These are pretty standard. The customer signs up for notifications and receives an email when the item is back in stock.&lt;/p&gt;
&lt;p&gt;Except I&apos;m a software engineer, so of course I looked at existing solutions and thought, &quot;I can do that better and cheaper&quot;. As is usually the case, that was only true if my time wasn&apos;t taken into account.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/back-in-stock.DA_PevyR_ZTpBIr.webp&quot; alt=&quot;Back in stock sign up modal&quot; /&gt;&lt;/p&gt;
&lt;p&gt;This was all built in a weekend and has cost $0. Here&apos;s how it&apos;s built.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;A Firebase Firestore stores the requests with a timestamp, product variant ID, and an email.&lt;/li&gt;
&lt;li&gt;When a variant is back in stock, &lt;a href=&quot;https://help.shopify.com/en/manual/shopify-flow&quot;&gt;Shopify Flow&lt;/a&gt; sends a request to a Firebase Function.&lt;/li&gt;
&lt;li&gt;The function finds all unsent requests in Firestore and batch submits email documents using Firebase&apos;s &lt;a href=&quot;https://firebase.google.com/docs/extensions/official/firestore-send-email&quot;&gt;Trigger Email&lt;/a&gt; extension.&lt;/li&gt;
&lt;li&gt;Those emails are sent using our Google Workspace SMTP server.&lt;/li&gt;
&lt;li&gt;Templates are generated using &lt;a href=&quot;https://github.com/mjmlio/mjml&quot;&gt;MJML&lt;/a&gt; and stored in Firestore as well. &lt;a href=&quot;https://github.com/shellscape/jsx-email&quot;&gt;JSX Email&lt;/a&gt; looks like another good option.&lt;/li&gt;
&lt;/ul&gt;
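&lt;p&gt;The heart of that function is just mapping pending requests to email documents. A simplified sketch (the document shape follows the Trigger Email extension&apos;s &lt;code&gt;to&lt;/code&gt;/&lt;code&gt;message&lt;/code&gt; convention; the function and field names here are made up for illustration):&lt;/p&gt;

```javascript
// Turn unsent back-in-stock requests into email documents that the
// Trigger Email extension will pick up and send. Pure logic only; the
// surrounding Firebase Function handles the Firestore reads and writes.
function buildEmailDocs(requests, product, html) {
  return requests.map((req) => ({
    to: req.email,
    message: {
      subject: `${product.title} is back in stock!`,
      html, // MJML-compiled template body, loaded from Firestore
    },
  }));
}
```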
&lt;p&gt;If I were rebuilding this today, I&apos;d probably use Cloudflare and Amazon SES. Firebase functions are unbelievably slow to deploy. At the time I built this, Cloudflare Email Workers weren&apos;t yet available.&lt;/p&gt;
&lt;h2&gt;Dispatching on Time&lt;/h2&gt;
&lt;p&gt;To dispatch our 30 orders a day on time, we need to save as much of Jack&apos;s (our full-time warehouse worker) time as possible.&lt;/p&gt;
&lt;h3&gt;Packing Slip Ordering&lt;/h3&gt;
&lt;p&gt;Our warehouse is laid out with our products along the walls in cubbies, &lt;a href=&quot;https://www.menards.com/main/search.html?search=stackable+bins&quot;&gt;stackable bins&lt;/a&gt;, and drawers. Normally, when a Shopify packing slip is printed out, the items are listed in the order the customer added them to the cart. This isn&apos;t very helpful.&lt;/p&gt;
&lt;p&gt;For the final time in this post, I will once again point to Metafields. For every product variant, we set a five-digit integer, which orders it in our packing slip. Note that the sorting is alphabetical, so make sure to use all five digits.&lt;/p&gt;
&lt;p&gt;After setting up the Metafields we can add a few lines to our packing slip liquid template that will order everything how we want it.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;+{% assign packing_orders = order.line_items | map: &apos;variant&apos; | map: &apos;metafields&apos; | map: &apos;custom&apos; | map: &apos;packing_order&apos; | compact | sort | uniq %}
+{% for packing_order in packing_orders %}
{% for line_item in order.line_items %}
+{% if line_item.variant.metafields.custom.packing_order == packing_order %}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Finally, some code! We first get every unique line item packing order, and then we iterate through each packing order and each line item, printing a line item only when its packing order matches. This is definitely inefficient, but it&apos;s fast enough and works well with Liquid.&lt;/p&gt;
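&lt;p&gt;The same grouping logic, mirrored in plain JavaScript with made-up field names, also shows why all five digits matter: the Metafield values compare as strings, so an unpadded &lt;code&gt;900&lt;/code&gt; would sort after &lt;code&gt;10000&lt;/code&gt;.&lt;/p&gt;

```javascript
// Mirror of the Liquid logic above: collect the unique packing orders,
// sort them, then emit line items grouped in that order. The values
// sort as strings, which is why "00900" must be zero-padded; a bare
// "900" would lexically sort after "10000".
function sortByPackingOrder(lineItems) {
  const orders = [...new Set(lineItems.map((item) => item.packingOrder))].sort();
  return orders.flatMap((order) =>
    lineItems.filter((item) => item.packingOrder === order)
  );
}
```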
&lt;h3&gt;Packing Slip Notes&lt;/h3&gt;
&lt;p&gt;When filling an order, it&apos;s helpful to know the shipping method and box size. This informs both what box to use (size and whether we use a USPS Priority Mail box) and how things should be packed (international orders may require things like bubble wrap).&lt;/p&gt;
&lt;p&gt;To add the shipping method and box size to the order, I use Shopify Flow to update the note.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/order-note-flow.DPJqvb4L_Z1vXLHk.webp&quot; alt=&quot;Shopify Flow trigger, code, and action&quot; /&gt;&lt;/p&gt;
&lt;p&gt;Whenever an order is created, we bucket the order into a box size by total shipment weight and then extract the shipping method.&lt;sup&gt;&lt;a href=&quot;#fn4&quot;&gt;[4]&lt;/a&gt;&lt;/sup&gt; Then the order note is set.&lt;/p&gt;
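&lt;p&gt;The &quot;Run code&quot; step itself is tiny. Something in this spirit, where the thresholds and box names are made up rather than our real ones:&lt;/p&gt;

```javascript
// Illustrative version of the Flow "Run code" step: bucket an order
// into a box size by total shipment weight, then build the note text.
// Thresholds and names are hypothetical, not our actual values.
function boxForWeight(grams) {
  if (grams > 2000) return "Large";
  if (grams > 500) return "Medium";
  return "Small";
}

function orderNote(order) {
  // Total weight across all line items, in grams.
  const grams = order.lineItems.reduce(
    (sum, item) => sum + item.grams * item.quantity,
    0
  );
  return `${boxForWeight(grams)} box / ${order.shippingLine}`;
}
```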
&lt;h3&gt;Box Inventory&lt;/h3&gt;
&lt;p&gt;I&apos;m really surprised that neither Shopify nor our current shipping platform supports box inventories. Keeping track of box inventory is tedious, and forgetting to order or check stock levels can cause huge delays in shipping. We have all the data, yet it&apos;s not tracked!&lt;/p&gt;
&lt;p&gt;To track this, when we complete a shipment on our shipping platform, a webhook hits a Firebase function endpoint with the shipment payload. Using the box name from this payload, we log a use of that box and update our inventory in Firestore.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/box-inventory.C-eQcYmi_2k8Kcg.webp&quot; alt=&quot;Box Inventory Dashboard&quot; /&gt;&lt;/p&gt;
&lt;p&gt;Once more, we created a Retool dashboard that calculates the average box usage per day and tells us the number of days of stock left. At the end of every day, Jack takes a look at this dashboard, and if any box is under 30 days, we order more and leave a note.&lt;/p&gt;
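&lt;p&gt;The dashboard math is simple. A sketch with hypothetical field names:&lt;/p&gt;

```javascript
// Sketch of the dashboard calculation: average daily usage over a
// lookback window, then whole days of stock remaining. The usageLog
// entry shape is hypothetical.
function daysRemaining(currentStock, usageLog, windowDays) {
  const used = usageLog.reduce((sum, entry) => sum + entry.count, 0);
  const perDay = used / windowDays;
  if (perDay === 0) return Infinity; // no recent usage of this box
  return Math.floor(currentStock / perDay);
}
```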
&lt;p&gt;Maybe at some point we can integrate this with our reorder notifications, but this works for now.&lt;/p&gt;
&lt;h3&gt;Multiple Order Flagging&lt;/h3&gt;
&lt;p&gt;Oftentimes, customers forget an item and end up creating multiple orders in one day. We can save a lot on shipping by combining these orders, but manually going through the orders every day is time-consuming.&lt;/p&gt;
&lt;p&gt;Amazingly, Shopify has a Flow &lt;a href=&quot;https://help.shopify.com/en/manual/shopify-flow/concepts/advanced-workflows/examples#when-a-new-order-is-created-check-if-the-same-customer-placed-other-orders-within-the-last-24-hours&quot;&gt;template&lt;/a&gt; exactly for this, and it works &lt;em&gt;great&lt;/em&gt;! Not much else to say on this besides it&apos;s awesome.&lt;/p&gt;
&lt;h2&gt;Quality Support, Fast&lt;/h2&gt;
&lt;p&gt;Admittedly, automating customer support is challenging and can result in a poor customer experience. Because of that, we don&apos;t actually automate much of our customer service. Instead, we try to automate everything else as much as possible to give Jack and Mike the time they need to properly handle requests.&lt;/p&gt;
&lt;p&gt;We also try to offer our customers as much documentation as possible to prevent unnecessary customer service requests. There are still a few things we do to lighten the burden.&lt;/p&gt;
&lt;h3&gt;Discord and an AI Bot&lt;/h3&gt;
&lt;p&gt;We have an incredible Discord &lt;a href=&quot;https://typeracitve.xyz/discord&quot;&gt;community&lt;/a&gt;. Members help each other, share their builds, and have created a true &lt;em&gt;community&lt;/em&gt;. When we ship our kits, we include a business card with a link to both our extensive documentation as well as our community Discord.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/business-card.Dhw8fGiC_qF2yc.webp&quot; alt=&quot;Business card&quot; /&gt;&lt;/p&gt;
&lt;p&gt;Once a customer gets to Discord, they can create a post in our help forum. When they do this, they&apos;re greeted by an AI bot that has our entire documentation as a system prompt.&lt;sup&gt;&lt;a href=&quot;#fn5&quot;&gt;[5]&lt;/a&gt;&lt;/sup&gt; While the AI bot does not resolve all submissions, it provides instant acknowledgment and starts the support process.&lt;/p&gt;
&lt;p&gt;Mike and Jack monitor the Discord and will help how they can, but often the technical questions aren&apos;t in their wheelhouse. Other community members graciously help out; otherwise, the questions are forwarded to me, and I&apos;ll help as soon as I can.&lt;/p&gt;
&lt;h3&gt;NotebookLM&lt;/h3&gt;
&lt;p&gt;This is a recent addition to our toolbox, but NotebookLM acts similarly to our Discord AI bot while being very flexible. We can create new notebooks; drop in documentation, data, customer feedback, and so on; and then get insights or ask questions about that material.&lt;/p&gt;
&lt;p&gt;I used this to analyze the results of our recent customer feedback giveaway, and the results were &lt;em&gt;really&lt;/em&gt; impressive. I read every response myself, but the summary was spot on and I was able to share it with the team.&lt;/p&gt;
&lt;h2&gt;Closing Thoughts&lt;/h2&gt;
&lt;p&gt;It&apos;s been fun to be a storefront owner and a software engineer. There are so many opportunities to spend a little time making things efficient for our exact use case. It&apos;s felt like a superpower that&apos;s put us ahead of our competition.&lt;/p&gt;
&lt;p&gt;This just scratches the surface of the technology that powers our store. There are some other interesting pieces, like our 3D customizer tool (our main differentiator), our shipping integrations, and our B2B vendor order automations. I may write about these in the future, but they don&apos;t feel immediately useful to others.&lt;/p&gt;
&lt;p&gt;I hope you found this peek into our operations useful or at least interesting! What automations have you found most helpful in &lt;em&gt;your&lt;/em&gt; business? I&apos;d love to hear about them.&lt;/p&gt;
&lt;hr /&gt;
&lt;section&gt;
&lt;ol&gt;
&lt;li&gt;&lt;p&gt;I also have a full-time software engineering job, so I only spend a few hours a week on the storefront. Jack is our only full-time, in-person employee. We have a few part-time people working ~5 hours a week for support and item pre-packaging. &lt;a href=&quot;#fnref1&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We actually achieve a P25 response time of just &lt;em&gt;twelve minutes&lt;/em&gt;. Jack and Mike are incredible. &lt;a href=&quot;#fnref2&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;I often see stores spend a lot of time setting up alternative self-hosted storefronts like WooCommerce, Vendure, Medusa, etc. While I&apos;m a huge advocate for self-hosting software (lots of our internal tools are self-hosted), I would highly recommend sticking to Shopify. Your main store expense, by far, will end up being payment processing. Shopify, at least in the US, is extremely competitive on payment processing, and the small monthly fee easily pays for itself in terms of stability, global speed, API and custom app capabilities, Shopify Flow, and more. &lt;a href=&quot;#fnref3&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Thank you Shopify for adding the &quot;Run code&quot; block! This used to be a &lt;em&gt;massive&lt;/em&gt; if/then/else flow tree. I can see it potentially replacing even more of my Firebase functions and Cloudflare Workers if they continue to expand its capabilities. &lt;a href=&quot;#fnref4&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Google Gemini is slept on. I&apos;m using the Flash 2.0 model, and it&apos;s one of the few models that actually seems to take the entire long context into account and doesn&apos;t miss instructions. It does all of this while being incredibly cost-effective. &lt;a href=&quot;#fnref5&quot;&gt;↩︎&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/section&gt;
</content:encoded></item><item><title>InkLink</title><link>https://nick.winans.io/blog/ink-link/</link><guid isPermaLink="true">https://nick.winans.io/blog/ink-link/</guid><description>Real-time Collaborative E-Paper Canvas</description><pubDate>Sun, 26 Jan 2025 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;This week, I built a collaborative e-paper display as a gift that I’m calling InkLink.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/inklink.Cb8VRodL_Z1tP9sk.webp&quot; alt=&quot;InkLink on a desk&quot; /&gt;&lt;/p&gt;
&lt;p&gt;This fun conversation piece updates every three minutes based on what people have drawn on a web interface.&lt;/p&gt;
&lt;p&gt;The web interface supports real-time multi-user canvas drawing. Try it out at &lt;a href=&quot;https://inklink.winans.io&quot;&gt;inklink.winans.io&lt;/a&gt;! &lt;em&gt;(This isn&apos;t connected to the actual display, feel free to draw whatever you want)&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/web.COvU98ZW_Z1mGjgF.webp&quot; alt=&quot;InkLink web interface&quot; /&gt;&lt;/p&gt;
&lt;h2&gt;How it works&lt;/h2&gt;
&lt;h3&gt;Hardware&lt;/h3&gt;
&lt;p&gt;The hardware side of things is pretty simple. The display is a 7.5-inch tri-color e-paper display from Waveshare. It&apos;s connected to an ESP32 microcontroller running CircuitPython that fetches the image from a server every three minutes.&lt;/p&gt;
&lt;h3&gt;Software&lt;/h3&gt;
&lt;p&gt;The website is a React app and a Bun server communicating real-time over WebSockets. The Bun server has its own server-side canvas thanks to &lt;a href=&quot;https://www.npmjs.com/package/@napi-rs/canvas&quot;&gt;@napi-rs/canvas&lt;/a&gt;. Each action is applied to the server canvas and then sent to all clients in real-time. The server also exposes a &lt;code&gt;/drawing.bmp&lt;/code&gt; endpoint that the ESP32 fetches every three minutes.&lt;/p&gt;
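&lt;p&gt;The key design choice is that the server, not the clients, owns the canvas: every action is applied to the server&apos;s pixel state before being broadcast, so late joiners and the e-paper endpoint always see the same drawing. A minimal reducer in that spirit (not the actual InkLink code):&lt;/p&gt;

```javascript
// Not the actual InkLink code: a minimal server-side reducer in the
// same spirit. The server applies each incoming WebSocket action to
// its own pixel buffer, making it the single source of truth for both
// connected browsers and the ESP32's periodic bitmap fetch.
function applyAction(pixels, action) {
  if (action.type === "draw") {
    for (const point of action.points) {
      // Flat row-major buffer: index = y * width + x.
      pixels[point.y * action.width + point.x] = action.color;
    }
  }
  if (action.type === "clear") {
    pixels.fill(0);
  }
  return pixels;
}
```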
&lt;p&gt;Side note: This was my first time using Bun, and I’m impressed. Its built-in WebSocket support is fantastic, and the server uses just 10MB of RAM at idle.&lt;/p&gt;
&lt;h2&gt;Build your own&lt;/h2&gt;
&lt;p&gt;This project is simple to make and makes a great gift. It&apos;s open source and available on GitHub at &lt;a href=&quot;https://github.com/Nicell/InkLink&quot;&gt;github.com/Nicell/InkLink&lt;/a&gt;, including a full build guide. All the parts are readily available, and the total cost is around $45-$85 depending on where you source them. Feel free to make your own or contribute to the project!&lt;/p&gt;
</content:encoded></item><item><title>Making My Blog 3x Faster</title><link>https://nick.winans.io/blog/optimizing-blog-fonts/</link><guid isPermaLink="true">https://nick.winans.io/blog/optimizing-blog-fonts/</guid><description>Optimizing fonts to slash load times</description><pubDate>Tue, 27 Jun 2023 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;Custom fonts are a great way to make your website stand out, but for simple static websites, they can be a huge performance bottleneck. Let me show you how I reduced the First Contentful Paint of my blog from &lt;strong&gt;2.5s to 0.9s, a nearly 3x improvement&lt;/strong&gt;, by optimizing fonts.&lt;/p&gt;
&lt;h2&gt;Font overview&lt;/h2&gt;
&lt;p&gt;On my blog, I use three fonts: Lexend Deca for headings, Geologica for the body text, and JetBrains Mono for code blocks. Then, I use the Google Fonts CDN to load them.&lt;/p&gt;
&lt;p&gt;Google Fonts offers a simple way to load fonts on your website. It generates a few &lt;code&gt;link&lt;/code&gt; tags that you add to your &lt;code&gt;head&lt;/code&gt; element. Here&apos;s the snippet my blog used:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;&amp;lt;link rel=&quot;preconnect&quot; href=&quot;https://fonts.googleapis.com&quot;&amp;gt;
&amp;lt;link rel=&quot;preconnect&quot; href=&quot;https://fonts.gstatic.com&quot; crossorigin&amp;gt;
&amp;lt;link href=&quot;https://fonts.googleapis.com/css2?family=Geologica&amp;amp;family=JetBrains+Mono:wght@400;600&amp;amp;family=Lexend+Deca:wght@400;600;700;800;900&amp;amp;display=swap&quot; rel=&quot;stylesheet&quot;&amp;gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;h2&gt;Why is Google Fonts slow?&lt;/h2&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/before.D8n65sv__Z8rJhf.webp&quot; alt=&quot;Google PageSpeed Insights when using Google Fonts, 88 Performance, 2.5s FCP&quot; /&gt;&lt;/p&gt;
&lt;p&gt;Above you can see the results of running &lt;a href=&quot;https://pagespeed.web.dev/&quot;&gt;Google PageSpeed Insights&lt;/a&gt; on my blog. It gives me a score of 88 for performance and a First Contentful Paint of 2.5s. That&apos;s not great for a basic static website.&lt;/p&gt;
&lt;p&gt;While Google Fonts is extremely easy to use, that simplicity comes at a cost. Because the fonts are hosted on a separate domain, the browser has to:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Make a DNS request to resolve the domain&lt;/li&gt;
&lt;li&gt;Make an HTTP request to fetch the CSS definitions for the fonts&lt;/li&gt;
&lt;li&gt;Make another request to download the fonts from the CSS definitions&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;That&apos;s a lot of work! The DNS request can be slow, and the chained request of CSS and then font is very slow on mobile networks that have high latency.&lt;/p&gt;
&lt;h2&gt;Hosting Fonts Locally&lt;/h2&gt;
&lt;p&gt;By hosting the font locally, we avoid the DNS request and the chained request. This way, we&apos;re just doing the font download step, which is much faster.&lt;/p&gt;
&lt;p&gt;To host the fonts locally, we need to download the font files and add them to our project. I downloaded the base variable fonts from Google Fonts and added them to my project in the public &lt;code&gt;fonts&lt;/code&gt; directory.&lt;/p&gt;
&lt;p&gt;Then I added inline CSS to my &lt;code&gt;head&lt;/code&gt; element in the layout file that&apos;s used for every page:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;  &amp;lt;style&amp;gt;
    @font-face {
      font-family: &quot;Geologica&quot;;
      src: url(&quot;/fonts/Geologica.ttf&quot;) format(&quot;truetype&quot;);
      font-display: fallback;
    }
    @font-face {
      font-family: &quot;Lexend Deca&quot;;
      src: url(&quot;/fonts/LexendDeca.ttf&quot;) format(&quot;truetype&quot;);
      font-display: fallback;
    }
    @font-face {
      font-family: &quot;JetBrains Mono&quot;;
      src: url(&quot;/fonts/JetBrainsMono.ttf&quot;) format(&quot;truetype&quot;);
      font-display: fallback;
    }
  &amp;lt;/style&amp;gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;h3&gt;Quick Aside on &lt;code&gt;font-display&lt;/code&gt;&lt;/h3&gt;
&lt;p&gt;I also added &lt;code&gt;font-display: fallback&lt;/code&gt; to the CSS to make sure that the fallback font is used while the custom font is loading. This is slightly different from &lt;code&gt;font-display: swap&lt;/code&gt;. &lt;code&gt;fallback&lt;/code&gt; gives the custom font 100ms of blank loading time for the custom font to load before switching to the fallback font. &lt;code&gt;swap&lt;/code&gt; switches to the fallback font immediately and then switches to the custom font when it&apos;s loaded. I prefer &lt;code&gt;fallback&lt;/code&gt; because it usually avoids the flash of unstyled text that &lt;code&gt;swap&lt;/code&gt; can cause.&lt;/p&gt;
&lt;h3&gt;Local Font Results&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/basic-local.DK43Ntqq_1cPxaH.webp&quot; alt=&quot;Google PageSpeed Insights when using basic local fonts, 98 Performance, 0.9s FCP&quot; /&gt;&lt;/p&gt;
&lt;p&gt;As we can see, the results are much better! We get a score of 98 for performance and a First Contentful Paint of 0.9s. That&apos;s a nearly 3x improvement in First Contentful Paint already!&lt;/p&gt;
&lt;p&gt;There are still some improvements to be done, though. The default variable fonts we downloaded from Google Fonts are huge.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/basic-local-size.CiPhaxTf_ZOrgkA.webp&quot; alt=&quot;Total download 305 KB, fonts download 231 KB&quot; /&gt;&lt;/p&gt;
&lt;p&gt;The total download size of the page is 305 KB, which is pretty good. However, the fonts are 231 KB, which is over 75% of the total download size! This is resulting in a Largest Contentful Paint that hasn&apos;t changed much, and there&apos;s still layout shift as the fonts load.&lt;/p&gt;
&lt;h2&gt;Optimizing Local Fonts&lt;/h2&gt;
&lt;p&gt;As you might suspect, Google Fonts doesn&apos;t serve these original variable fonts to the browser. Instead, it serves a subset of the font, optimizes it, and compresses it. We can do the same thing to our local fonts to get the same benefits.&lt;/p&gt;
&lt;p&gt;We&apos;ll be using the &lt;code&gt;pyftsubset&lt;/code&gt; tool from &lt;code&gt;fonttools&lt;/code&gt; to subset and optimize the fonts. First, we need to install &lt;code&gt;fonttools&lt;/code&gt;:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;pip install fonttools brotli zopfli
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Then, we can run &lt;code&gt;pyftsubset&lt;/code&gt; on our fonts to subset and optimize them:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;pyftsubset \
  LexendDeca.ttf \
  --output-file=&quot;LexendDeca.woff2&quot; \
  --flavor=woff2 \
  --layout-features=&quot;kern,liga,clig,calt,ccmp,locl,mark,mkmk,\
  onum,pnum,smcp,c2sc,frac,lnum,tnum,subs,sups,\
  cswh,dlig,ss01,ss03,zero&quot; \
  --unicodes=&quot;U+0000-00FF,U+0131,U+0152-0153,U+02BB-02BC,\
  U+02C6,U+02DA,U+02DC,U+2000-206F,U+2074,U+20AC,\
  U+2122,U+2191,U+2193,U+2212,U+2215,U+FEFF,U+FFFD&quot;
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Let&apos;s break down what each of these options are doing:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;code&gt;--flavor=woff2&lt;/code&gt; tells &lt;code&gt;pyftsubset&lt;/code&gt; to output a WOFF2 file, which is a compressed font format that&apos;s supported by all modern browsers.&lt;/li&gt;
&lt;li&gt;&lt;code&gt;--layout-features&lt;/code&gt; lets us select the OpenType features we want to include in the font. We&apos;re selecting all the ones we should need.&lt;/li&gt;
&lt;li&gt;&lt;code&gt;--unicodes&lt;/code&gt; lets us select the Unicode characters we want to include in the font. We&apos;re selecting the same range that Google Fonts uses for their Latin subset. You can change this depending on if you need to support other languages.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;em&gt;For further reading on why these options are being chosen, please check out the &lt;a href=&quot;https://markoskon.com/creating-font-subsets/&quot;&gt;amazing article&lt;/a&gt; from Markos Konstantopoulos that gives much more detail to this process.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Now that we have our subsetted and optimized font, we can update our CSS to use the WOFF2 file instead of the TTF file:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;@font-face {
  font-family: &quot;Lexend Deca&quot;;
  src: url(&quot;/fonts/LexendDeca.woff2&quot;) format(&quot;woff2&quot;);
  font-display: fallback;
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;After completing this, the home page fonts download size was reduced from 231 KB to 82 KB and the total download size was reduced from 305 KB to 153 KB.&lt;/p&gt;
&lt;h3&gt;Optimized Local Font Results&lt;/h3&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/optimized-local.DCWFDyAI_Z2vqsTm.webp&quot; alt=&quot;Google PageSpeed Insights when using optimized local fonts, 100 Performance, 0.9s FCP&quot; /&gt;&lt;/p&gt;
&lt;p&gt;And now we&apos;ve achieved the coveted 100 performance score! The reduction in the font download size helped us reduce the Largest Contentful Paint by a whole second and eliminated the layout shift entirely.&lt;/p&gt;
&lt;h2&gt;Optimize Your Fonts!&lt;/h2&gt;
&lt;p&gt;Fonts are often overlooked when optimizing the speed of a website, but they can have a huge impact, particularly on simple static websites. By taking the simple step of hosting the fonts locally and optimizing them, we&apos;re able to dramatically improve the performance of our website.&lt;/p&gt;
&lt;p&gt;Give it a try on your own website, and hopefully you&apos;ll see some impressive results!&lt;/p&gt;
</content:encoded></item><item><title>Exploiting GitHub Actions</title><link>https://nick.winans.io/blog/exploiting-github-actions/</link><guid isPermaLink="true">https://nick.winans.io/blog/exploiting-github-actions/</guid><description>Finding a way around GitHub&apos;s build matrix limits with GitHub Script</description><pubDate>Thu, 17 Mar 2022 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;I&apos;m a contributor to the open-source keyboard firmware &lt;a href=&quot;https://zmk.dev&quot;&gt;ZMK&lt;/a&gt;, which is a firmware that runs on custom, usually wireless keyboards. It turns electrical keypresses into real output to your computer with a bunch of cool features and customizations. In our project, we house widely available custom &quot;boards&quot; and &quot;shields&quot;. &quot;Boards&quot; are defined as the part of the keyboard that has the microprocessor running the firmware, and &quot;shields&quot; are the keyboard looking circuit board that a &quot;board&quot; plugs into. With every commit to the main repository and pull requests we build firmware for every single board and shield combination using GitHub Actions.&lt;/p&gt;
&lt;p&gt;For those unfamiliar, GitHub Actions is a Continuous Integration/Delivery (CI/CD) platform. More simply, it allows us to run build scripts on every commit to make sure everything still works with the new changes committed. This is free for open source projects with some limitations, which we&apos;ll see soon.&lt;/p&gt;
&lt;h2&gt;The Problem&lt;/h2&gt;
&lt;p&gt;ZMK has grown in popularity, and today we&apos;re sitting at 13 boards and 44 shields, which is awesome. For our GitHub Actions workflows, we utilize what&apos;s called a build matrix to generate all of the different firmware files required. This allows us to set all the boards as one dimension and all the shields as the other, and then it builds every single combination possible, which is perfect... except for one thing. GitHub Actions limits matrix builds to 256 maximum jobs. Today we have 528 combinations, so we exceed this limit, and the number of combinations will only grow. We need a solution to be able to continue building every combination.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/current.DcVZ3Ji8_2caFJy.webp&quot; alt=&quot;Diagram of the current build solution, failing at the 256 limit&quot; /&gt;&lt;/p&gt;
&lt;h2&gt;Analyzing Possible Solutions&lt;/h2&gt;
&lt;p&gt;The first thing you might think is, why not just loop through every combination in one job instead of running a different job for each firmware file? This isn&apos;t a bad train of thought, but we have to consider our limitations.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/bash.fWRVZ9H1_Z1e9Llr.webp&quot; alt=&quot;Diagram of a bash script solution, it doesn&apos;t let us upload&quot; /&gt;&lt;/p&gt;
&lt;p&gt;We have to use basic bash scripting to complete these loops, and feeding these combinations via JSON/YAML is not an easy task. The other major issue is that we need to upload each build file as an artifact so users can find and download them. Unfortunately, there isn&apos;t an easy way to upload unique artifacts in a loop since you can&apos;t call the artifact upload action from inside a raw bash script.&lt;/p&gt;
&lt;p&gt;When I looked into this problem I first tried to remove the build matrix portion entirely. In my mind, there was no way of getting around the 256 limit. I turned to the GitHub Actions API, which has a way to invoke workflow runs with parameters. I thought this was perfect to run an unlimited amount of jobs since we&apos;re working outside of the matrix job limit and we can queue however many workflows as we want.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/actions-api.CDdtBn3j_252lr5.webp&quot; alt=&quot;Diagram of calling the GitHub Actions API, failing at security&quot; /&gt;&lt;/p&gt;
&lt;p&gt;This was looking like a good solution until I realized one flaw. To call these API endpoints, we needed to pass a privileged API key into the GitHub Action script. While we can hide the key fairly effectively, the project is open source, and we run these scripts on every pull request, making the key an easy target for nefarious activities.&lt;/p&gt;
&lt;p&gt;With direct API calls off the table, I turned back to looping through every combination as a possible solution. This time I was trying to figure out how I could more easily walk through all the combinations and upload every file individually. I ultimately couldn&apos;t figure out a way to do it in bash scripting, and then I stumbled upon a GitHub action that solved all my problems by misusing it: GitHub Script.&lt;/p&gt;
&lt;h2&gt;GitHub Script&lt;/h2&gt;
&lt;p&gt;The GitHub Script action&apos;s tagline is &quot;Write workflows scripting the GitHub API in JavaScript&quot;. This action is &lt;em&gt;supposed&lt;/em&gt; to be used to interact with GitHub issues, pull requests, and other GitHub API elements. Crucially, it gives a sandboxed environment that&apos;s still privileged and contains the current workflow&apos;s context. This allows us to upload file artifacts arbitrarily whenever we want, including in loops.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/github-script.B2iizHr-_ZEeUnf.webp&quot; alt=&quot;Diagram of using GitHub script to loop, meeting all requirements&quot; /&gt;&lt;/p&gt;
&lt;p&gt;Instead of using it to interact with GitHub API elements, I use it to run the same scripts we ran in the bash script to build each combination and then upload the file artifact, perfect!&lt;/p&gt;
&lt;p&gt;One issue with this solution, though, is that we lost all parallelization. This means we have to build all 528 files one at a time, which is pretty slow. To improve upon this, I decided to bring back the job matrix, but use it differently this time. I now use it to create a new job for each board, not for each combination. So we now create 13 board jobs that each run all the possible combinations for that specific board using GitHub Script.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/github-script-parallel.BCwOXCh__Z1L5PdD.webp&quot; alt=&quot;Diagram of added parallelization by putting a matrix between the JSON shield list and the GitHub script loop&quot; /&gt;&lt;/p&gt;
&lt;p&gt;With this solution, we can build 528 firmware files just as fast as we used to build only 256 before. I&apos;d consider this a huge success that should last for years to come. That is until we get 256 boards...&lt;/p&gt;
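&lt;p&gt;The grouping itself is straightforward. Sketched out with illustrative names (not ZMK&apos;s actual workflow code):&lt;/p&gt;

```javascript
// Sketch of the restructured matrix: one job per board, and each job's
// builds list is what the GitHub Script step loops over, building and
// uploading one firmware artifact per entry.
function buildJobs(boards, shields) {
  return boards.map((board) => ({
    board,
    builds: shields.map((shield) => ({ board, shield })),
  }));
}
```

&lt;p&gt;With 13 boards, this stays far under the 256-job matrix limit while the inner loop still covers every combination.&lt;/p&gt;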
&lt;h2&gt;Other Improvements&lt;/h2&gt;
&lt;p&gt;While solving this major problem for our build script, I also overhauled a couple of other parts of our build script that may interest you.&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;We used to manually write every board/shield in a list inside the build script to keep track of what combinations to build. This was an extra file to maintain and caused tons of merge issues as people were creating pull requests simultaneously. We&apos;ve transitioned to a system that reads in what boards and shields there are dynamically by walking the board/shield directory of the project.&lt;/li&gt;
&lt;li&gt;We used to build around 256 firmware files on every single commit no matter what the change was. It covered everything, but it was inefficient and caused hours of pileups of queued jobs. The new script checks whether a change affects a board/shield or something core in the firmware. If there is a board/shield change, it builds every board/shield combination that&apos;s affected by the change. If there&apos;s a core change, it builds a short list of boards/shields that cover nearly every use case. This has cut down on almost all of our queued job issues and made our checks complete in a fraction of the time. We also build every combination nightly to make sure every combination has fresh firmware with all the new features.&lt;/li&gt;
&lt;/ol&gt;
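&lt;p&gt;The second improvement can be sketched as a pure function. The paths and names here are hypothetical, and ZMK&apos;s real script is more involved:&lt;/p&gt;

```javascript
// Hedged sketch of the change-detection idea: board/shield changes
// rebuild only the affected combinations, while core firmware changes
// rebuild a short representative subset. Directory layout is made up.
function selectBuilds(changedFiles, boards, shields, coreSubset) {
  const touched = (name) =>
    changedFiles.some((file) => file.includes(`/${name}/`));
  const changedBoards = boards.filter(touched);
  const changedShields = shields.filter(touched);
  if (changedBoards.length === 0) {
    if (changedShields.length === 0) {
      // Nothing board/shield-specific changed: core change, so build
      // the representative subset instead of all 528 combinations.
      return coreSubset;
    }
  }
  const builds = [];
  for (const board of changedBoards) {
    for (const shield of shields) builds.push({ board, shield });
  }
  for (const shield of changedShields) {
    for (const board of boards) builds.push({ board, shield });
  }
  return builds;
}
```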
&lt;p&gt;If you&apos;re curious to see what these changes looked like you can see the &lt;a href=&quot;https://github.com/zmkfirmware/zmk/pull/889&quot;&gt;pull request here&lt;/a&gt;.&lt;/p&gt;
</content:encoded></item><item><title>Mysterious Broken Bootloader</title><link>https://nick.winans.io/blog/zmk-bootloader/</link><guid isPermaLink="true">https://nick.winans.io/blog/zmk-bootloader/</guid><description>Investigating and fixing the bootloader woes of ZMK</description><pubDate>Sat, 03 Oct 2020 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;&lt;em&gt;This post was originally published on the &lt;a href=&quot;https://zmk.dev/blog/2020/10/03/bootloader-fix&quot;&gt;ZMK Firmware Blog&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Recently I was able to fix the &quot;stuck in the bootloader&quot; issue in &lt;a href=&quot;https://github.com/zmkfirmware/zmk/pull/322&quot;&gt;#322&lt;/a&gt; that had been plaguing us for quite some time. I want to go over what the issue was, how the issue was diagnosed, and how it was fixed.&lt;/p&gt;
&lt;h2&gt;Background&lt;/h2&gt;
&lt;p&gt;What exactly is the &quot;stuck in the bootloader&quot; issue? Seemingly randomly, users&apos; keyboards would suddenly stop working, and when they reset their keyboard they would get put into the bootloader instead of back into the firmware. This would require the user to re-flash the firmware to get back into it. That wouldn&apos;t be so bad except for the fact that once this occurs, every reset would require the user to re-flash the firmware again. The only way to really fix this issue was to re-flash the bootloader itself, which is a huge pain.&lt;/p&gt;
&lt;p&gt;Going into this, all we knew was that this issue was most likely introduced somewhere in &lt;a href=&quot;https://github.com/zmkfirmware/zmk/pull/133&quot;&gt;#133&lt;/a&gt;, which added Bluetooth profile management. We&apos;ve had quite a few attempts at recreating the issue, but we never were able to get it to happen consistently.&lt;/p&gt;
&lt;h2&gt;Diagnosing the issue&lt;/h2&gt;
&lt;p&gt;This issue had been happening sporadically for the past month, and I finally
decided to dig in to see what was going on. We started in the Discord and
discussed what was common between all of the people who have experienced this
issue. Everyone who had this issue reported that they did quite a bit of profile
switching. This lined up with the possible connection to the Bluetooth profile
management pull request.&lt;/p&gt;
&lt;h3&gt;Pinpointing the cause&lt;/h3&gt;
&lt;p&gt;I had a hunch that this was related to the settings system. The settings system
is used by profile Bluetooth switching, and the settings system works directly
with the system flash. Based on this hunch, I tried spamming the RGB underglow
cycle behavior on my main keyboard. Sure enough after a couple minutes, I got
stuck in the bootloader. I was even able to reproduce it again.&lt;/p&gt;
&lt;p&gt;This was an important discovery for two reasons. First, I was able to recreate
the issue consistently, which meant I could set up logging and more closely
monitor what the board was doing. Second, this more or less proved that it was
specifically the settings system at fault. Both Bluetooth profile switching and
RGB underglow cycling trigger it, and the one common piece is they save their
state to settings.&lt;/p&gt;
&lt;h3&gt;Settings system overview&lt;/h3&gt;
&lt;p&gt;To understand what&apos;s going wrong, we first need to understand how the settings
system works. Here&apos;s a diagram of the flash space that the settings system
holds on our nRF52840-based boards (nice!nano, nRFMicro, BlueMicro).&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/settings-diagram.CxZblHeI_ZaRcgV.webp&quot; alt=&quot;Settings Diagram&quot; /&gt;&lt;/p&gt;
&lt;p&gt;The settings flash space lives at the end of the chip&apos;s flash. In this case
it starts at &lt;code&gt;0xF8000&lt;/code&gt; and is &lt;code&gt;0x8000&lt;/code&gt; bytes long, which is 32KB in more
comprehensible units. Due to the chip&apos;s architecture, this flash space is
broken into pages, which are &lt;code&gt;0x1000&lt;/code&gt; bytes (4KB) in size.&lt;/p&gt;
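&lt;p&gt;A quick sanity check of those numbers (a throwaway snippet, not part of ZMK):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;# Settings flash area on the nRF52840 boards (values from above).
SETTINGS_START = 0xF8000  # start of the settings flash space
SETTINGS_SIZE = 0x8000    # its length in bytes
PAGE_SIZE = 0x1000        # page size dictated by the chip

print(SETTINGS_SIZE // 1024)                # 32 (KB)
print(SETTINGS_SIZE // PAGE_SIZE)           # 8 pages
print(hex(SETTINGS_START + SETTINGS_SIZE))  # 0x100000, the very end of flash
&lt;/code&gt;&lt;/pre&gt;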
&lt;p&gt;The backend that carries out settings save and read operations in ZMK is
called NVS. NVS calls these pages sectors. Due to how flash works, you can&apos;t
write to the same bytes multiple times without erasing them first, and to erase
bytes, you need to erase the entire sector of flash. This means that when NVS
writes to the settings flash and there&apos;s no erased space available for the new
value, it needs to erase a sector.&lt;/p&gt;
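&lt;p&gt;The erase-before-write rule is the key constraint here. Here&apos;s a toy model of it in Python (illustrative only; this is not how NVS is actually implemented):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;# Toy model of flash write rules: erased bytes read as 0xFF, and a byte
# that has already been written can&apos;t be rewritten until its entire
# sector is erased.
SECTOR_SIZE = 0x1000

def erase(sector):
    # Erasing always operates on the whole sector at once.
    for i in range(SECTOR_SIZE):
        sector[i] = 0xFF

def write(sector, offset, value):
    if sector[offset] != 0xFF:
        raise ValueError(&quot;byte in use; erase the whole sector first&quot;)
    sector[offset] = value

sector = [0xFF] * SECTOR_SIZE
write(sector, 0, 0x42)      # fine: the byte was erased
try:
    write(sector, 0, 0x17)  # rewriting the same byte fails
except ValueError as e:
    print(e)
erase(sector)
write(sector, 0, 0x17)      # fine again after a full-sector erase
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;This is why NVS appends new values into erased space and only erases a sector once it runs out of room.&lt;/p&gt;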
&lt;h3&gt;Logging discoveries&lt;/h3&gt;
&lt;p&gt;So first I enabled logging of the NVS module by adding
&lt;code&gt;CONFIG_NVS_LOG_LEVEL_DBG=y&lt;/code&gt; to my &lt;code&gt;.conf&lt;/code&gt; file. I repeated the same test of
spamming the RGB underglow effect cycle, and these were the resulting logs:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;[00:00:00.000,671] &amp;lt;inf&amp;gt; fs_nvs: 8 Sectors of 4096 bytes
[00:00:00.000,671] &amp;lt;inf&amp;gt; fs_nvs: alloc wra: 3, f70
[00:00:00.000,671] &amp;lt;inf&amp;gt; fs_nvs: data wra: 3, f40
// A bunch of effect cycle spam
[00:02:34.781,188] &amp;lt;dbg&amp;gt; fs_nvs: Erasing flash at fd000, len 4096
// A bunch more effect cycle spam
[00:06:42.219,970] &amp;lt;dbg&amp;gt; fs_nvs: Erasing flash at ff000, len 4096
// A bunch more effect cycle spam
// KABOOM - bootloader issue
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;At startup, we can see that NVS properly finds the 8 sectors of 4KB. I wasn&apos;t
sure what the second and third lines meant, but we&apos;ll get back to that. The
next two NVS logs showed it erasing the sector at &lt;code&gt;0xFD000&lt;/code&gt; and then the
sector at &lt;code&gt;0xFF000&lt;/code&gt;.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/erased-sectors.D-tOQiDv_ktWsf.webp&quot; alt=&quot;Erased Sectors&quot; /&gt;&lt;/p&gt;
&lt;p&gt;It&apos;s really odd that the third-to-last sector and the last sector were erased,
and that the bootloader issue was hit shortly after. I had no explanation for
this behavior.&lt;/p&gt;
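&lt;p&gt;For reference, we can map those addresses back to sector indices (a quick throwaway calculation):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;# Which of the 8 settings sectors (indices 0-7) live at those addresses?
SETTINGS_START = 0xF8000
SECTOR_SIZE = 0x1000

print((0xFD000 - SETTINGS_START) // SECTOR_SIZE)  # 5, the third-to-last sector
print((0xFF000 - SETTINGS_START) // SECTOR_SIZE)  # 7, the last sector
&lt;/code&gt;&lt;/pre&gt;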
&lt;h3&gt;Reaching out to Zephyr&lt;/h3&gt;
&lt;p&gt;At this point, neither I nor anyone else working on the ZMK project knew enough
about NVS to explain what was going on. &lt;a href=&quot;https://github.com/petejohanson&quot;&gt;Pete
Johanson&lt;/a&gt;, the project&apos;s founder, reached out on the
Zephyr Project&apos;s Slack (ZMK is built on top of Zephyr, if you weren&apos;t aware).
Justin B and Laczen assisted by first explaining that those &lt;code&gt;alloc wra&lt;/code&gt; and
&lt;code&gt;data wra&lt;/code&gt; logs from earlier show what data NVS found at startup.&lt;/p&gt;
&lt;p&gt;More specifically, &lt;code&gt;data wra&lt;/code&gt; should be &lt;code&gt;0&lt;/code&gt; when NVS first starts up on a
clean flash. As we can see from my earlier logging, on a clean flash I was
instead getting &lt;code&gt;f40&lt;/code&gt;. NVS was finding data in our settings sectors when they
should have been blank! We were then given the advice to double-check our
bootloader.&lt;/p&gt;
&lt;h3&gt;The Adafruit nRF52 Bootloader&lt;/h3&gt;
&lt;p&gt;Most of the boards that ZMK contributors use run the &lt;a href=&quot;https://github.com/adafruit/Adafruit_nRF52_Bootloader&quot;&gt;Adafruit nRF52
Bootloader&lt;/a&gt;, which allows
for extremely easy flashing by dragging and dropping &lt;code&gt;.uf2&lt;/code&gt; files onto the board
as a USB drive. Every bootloader takes up a portion of the flash, and the README
explains that the first &lt;code&gt;0x26000&lt;/code&gt; bytes are reserved for the bootloader on the
nRF52840, which we had properly accounted for.&lt;/p&gt;
&lt;p&gt;However, the README doesn&apos;t fully explain the bootloader&apos;s flash allocation.
There&apos;s a possibility that the bootloader is using part of the same flash area
we&apos;re using. I reached out on the Adafruit Discord,
and &lt;a href=&quot;https://github.com/dhalbert&quot;&gt;Dan Halbert&lt;/a&gt; pointed me towards the &lt;a href=&quot;https://github.com/adafruit/Adafruit_nRF52_Bootloader/blob/master/linker/nrf52840.ld&quot;&gt;linker
script&lt;/a&gt;
for the nRF52840. Let&apos;s take a look.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;FLASH (rx) : ORIGIN = 0xF4000, LENGTH = 0xFE000-0xF4000-2048 /* 38 KB */

BOOTLOADER_CONFIG (r): ORIGIN = 0xFE000 - 2048, LENGTH = 2048

/** Location of mbr params page in flash. */
MBR_PARAMS_PAGE (rw) : ORIGIN = 0xFE000, LENGTH = 0x1000

/** Location of bootloader setting in flash. */
BOOTLOADER_SETTINGS (rw) : ORIGIN = 0xFF000, LENGTH = 0x1000
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Here&apos;s a diagram to show this a bit better.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://nick.winans.io/_astro/adafruit-bootloader-diagram.C4roh46H_9OtnR.webp&quot; alt=&quot;Adafruit Bootloader Diagram&quot; /&gt;&lt;/p&gt;
&lt;p&gt;We&apos;ve found the issue! As you can see from the red bar (representing our
settings flash area), we&apos;ve put the settings flash area &lt;em&gt;right on top&lt;/em&gt; of the
Adafruit bootloader&apos;s flash space. Oops!&lt;/p&gt;
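&lt;p&gt;We can confirm the overlap with a bit of arithmetic (a throwaway snippet; the regions are taken from the linker script above):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;# The bootloader&apos;s regions are contiguous from 0xF4000 to the end of flash:
# FLASH + BOOTLOADER_CONFIG + MBR_PARAMS_PAGE + BOOTLOADER_SETTINGS.
bootloader = (0xF4000, 0x100000)
settings = (0xF8000, 0x100000)  # ZMK&apos;s storage partition

overlap_start = max(settings[0], bootloader[0])
overlap_end = min(settings[1], bootloader[1])
print(hex(overlap_start), hex(overlap_end))  # 0xf8000 0x100000
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Every byte of the settings area falls inside the bootloader&apos;s flash.&lt;/p&gt;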
&lt;p&gt;This also sheds some light on why NVS erased the &lt;code&gt;0xFD000&lt;/code&gt; and &lt;code&gt;0xFF000&lt;/code&gt; sectors.
It&apos;s possible that nothing had been written to &lt;code&gt;0xFD000&lt;/code&gt; because the bootloader
didn&apos;t use up all of the space it has, and that no bootloader settings had been
set yet, so &lt;code&gt;0xFF000&lt;/code&gt; could be used and erased by NVS too.&lt;/p&gt;
&lt;p&gt;After erasing &lt;code&gt;0xFF000&lt;/code&gt;, NVS probably went on to erase a rather important part
of the bootloader, resulting in the issue at hand. In my opinion, we&apos;re pretty
lucky that it didn&apos;t erase an even more vital part of the bootloader. At least
we could still get into it, so re-flashing the bootloader was easy!&lt;/p&gt;
&lt;h2&gt;The solution&lt;/h2&gt;
&lt;p&gt;Now that we&apos;ve found the issue, we can fix it pretty easily. We need to move
the settings flash area back so that it doesn&apos;t overlap with the bootloader.
First, we calculate the size of the flash area the bootloader is using.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;0x100000 (end of flash) - 0x0F4000 (start of bootloader) = 0xC000 (48KB)
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;So the bootloader is using the last 48KB of the flash. This means all we need
to do is shift the settings area back by &lt;code&gt;0xC000&lt;/code&gt; bytes and shrink the code
space to match. We&apos;ll apply this to the &lt;code&gt;.dts&lt;/code&gt; files of all the boards affected
by this issue.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;		code_partition: partition@26000 {
			label = &quot;code_partition&quot;;
-			reg = &amp;lt;0x00026000 0x000d2000&amp;gt;;
+			reg = &amp;lt;0x00026000 0x000c6000&amp;gt;;
		};


-		storage_partition: partition@f8000 {
+		storage_partition: partition@ec000 {
			label = &quot;storage&quot;;
-			reg = &amp;lt;0x000f8000 0x00008000&amp;gt;;
+			reg = &amp;lt;0x000ec000 0x00008000&amp;gt;;
		};
&lt;/code&gt;&lt;/pre&gt;
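&lt;p&gt;A quick check that the new layout lines up (a throwaway snippet, using the values from the diff):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;# New partition layout from the diff above.
code_start, code_len = 0x26000, 0xC6000
storage_start, storage_len = 0xEC000, 0x8000
BOOTLOADER_START = 0xF4000

print(hex(code_start + code_len))        # 0xec000, where storage begins
print(hex(storage_start + storage_len))  # 0xf4000, right where the bootloader begins
&lt;/code&gt;&lt;/pre&gt;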
&lt;p&gt;And with those changes, we should no longer run into this issue! In the process
of these changes, we lost 48KB of space for application code, but we&apos;re only
using around 20% of it anyways. 🎉&lt;/p&gt;
</content:encoded></item></channel></rss>