<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Stories by Hyo on Medium]]></title>
        <description><![CDATA[Stories by Hyo on Medium]]></description>
        <link>https://medium.com/@hyodotdev?source=rss-6de083ee3256------2</link>
        <image>
            <url>https://cdn-images-1.medium.com/fit/c/150/150/1*YcQFlr4MqLn1AUP9uO0KNg.jpeg</url>
            <title>Stories by Hyo on Medium</title>
            <link>https://medium.com/@hyodotdev?source=rss-6de083ee3256------2</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Thu, 09 Apr 2026 00:54:19 GMT</lastBuildDate>
        <atom:link href="https://medium.com/@hyodotdev/feed" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[LangChain for Mobile, Entirely On-Device — Meet Locanara]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://medium.com/dooboolab/langchain-for-mobile-entirely-on-device-meet-locanara-33112ade3b0e?source=rss-6de083ee3256------2"><img src="https://cdn-images-1.medium.com/max/1648/1*AFz_w6O6n954uIb4qoPXeg.png" width="1648"></a></p><p class="medium-feed-snippet">Building LangChain for Mobile: How We Designed an On-Device AI Framework for iOS and Android</p><p class="medium-feed-link"><a href="https://medium.com/dooboolab/langchain-for-mobile-entirely-on-device-meet-locanara-33112ade3b0e?source=rss-6de083ee3256------2">Continue reading on Hyo Dev »</a></p></div>]]></description>
            <link>https://medium.com/dooboolab/langchain-for-mobile-entirely-on-device-meet-locanara-33112ade3b0e?source=rss-6de083ee3256------2</link>
            <guid isPermaLink="false">https://medium.com/p/33112ade3b0e</guid>
            <category><![CDATA[apple-intelligence]]></category>
            <category><![CDATA[ai-on-device]]></category>
            <category><![CDATA[on-device-llm]]></category>
            <category><![CDATA[gemini-nano]]></category>
            <dc:creator><![CDATA[Hyo]]></dc:creator>
            <pubDate>Thu, 19 Feb 2026 23:08:27 GMT</pubDate>
            <atom:updated>2026-02-19T23:39:40.691Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[Moltbot: A Bold Proposal for Where AI Should Live — and Why It’s Not Ready Yet]]></title>
            <link>https://hyodotdev.medium.com/moltbot-a-bold-proposal-for-where-ai-should-live-and-why-its-not-ready-yet-9e9d60a86e6c?source=rss-6de083ee3256------2</link>
            <guid isPermaLink="false">https://medium.com/p/9e9d60a86e6c</guid>
            <category><![CDATA[moltbot]]></category>
            <category><![CDATA[clawdbot]]></category>
            <category><![CDATA[claude]]></category>
            <category><![CDATA[ai-agent]]></category>
            <category><![CDATA[agents]]></category>
            <dc:creator><![CDATA[Hyo]]></dc:creator>
            <pubDate>Fri, 30 Jan 2026 14:27:38 GMT</pubDate>
            <atom:updated>2026-01-30T14:27:38.848Z</atom:updated>
            <content:encoded><![CDATA[<p>Less than a week has passed, and once again the tech world is buzzing with news of a new AI agent.</p><p>In this article, I want to take a closer look at one of the most talked-about projects recently: <strong>Clawdbot</strong> — now reborn as <strong>Moltbot</strong> after a trademark complaint from Anthropic forced a name change.<br>As the name suggests, it has shed its old skin. But has it truly become stronger?</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*azqBr_EULo9kd_CDUZaE9w.png" /></figure><p>Right now, communities like X and Reddit are on fire over Moltbot. This AI runs <strong>24/7</strong>, all year round. One viral post claimed that a user entrusted the bot with <strong>$1 million</strong> in capital to manage. The bot reportedly analyzed over <strong>3,000 reports in real time</strong>, charted every technical indicator, and executed stock trades nonstop.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*FBdPhbo8H_Yn0fTkjKO4_w.png" /></figure><p>The result?<br> All the money was lost — but, as the post jokingly put it, <em>“the process was beautiful.”</em></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*CBgifUqKaUKe2K7h2GIvYw.png" /></figure><p>At this very moment, people are posting screenshots of Moltbot automatically sending affectionate messages to their girlfriends, writing complex code, or trading stocks on their behalf. Yet the developer has been clear: this started as a <strong>personal hobby project</strong>, meant to inspire people — not a magic tool to make them rich.</p><p>Despite that, many have misunderstood it as a shortcut to becoming a millionaire. Some even claim there’s a <strong>Mac mini shortage</strong> because people are buying them in bulk just to run Moltbot 24/7.</p><p>So what exactly is this thing? 
And why are people reacting this way?</p><p>Let’s break it down — and then look at it honestly from a developer’s perspective.</p><h3>Installation &amp; Setup (Quick Notes)</h3><p>Since many readers have already heard about Moltbot, I’ll keep this short.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*1UsoG182gnbyMWfeL1V4Cg.png" /></figure><p>One small tip: I initially tried installing it via <strong>npm</strong>, but ran into a missing CLI error. Reinstalling via <strong>curl</strong> worked without issues.</p><p>During setup, you’re greeted with a rather alarming warning: the bot can control your file system. This is not something to click through lightly. Read it carefully before proceeding.</p><p>You then enter onboarding, where you choose a model. Instead of entering an API key (and risking surprise billing later), I opted for <strong>Google OAuth</strong>, since I already subscribe to Gemini. I selected <strong>Gemini 2.5 Pro</strong> for better performance.</p><p>You’re asked about channels, skills, hooks, gateways, and execution environments. Many of these involve fairly powerful permissions. I skipped most of them for now and chose to run the bot in the <strong>terminal UI</strong>, which the project itself recommends.</p><p>Once launched, a browser window opens alongside the terminal. The bot asks for your name — and from there, you can literally watch it <strong>store and remember information about you in real time</strong>.</p><p>That’s enough for the walkthrough. Now let’s talk about what actually matters.</p><h3>1. 
The Real Innovation: Not Technology, but Positioning</h3><p>Moltbot’s GitHub stars are skyrocketing, and there’s a reason.<br> But if we strip things down to their technical essence, Moltbot is <strong>not fundamentally new technology</strong>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*NcqF9uWuNvk_aU7yDeG9xA.png" /></figure><p>Calling LLMs, storing state, connecting external tools — all of this can already be done with Claude Skills, ChatGPT Actions, LangChain, or Zapier. Even the so-called “gateway” is, in practice, just a long-running bot with webhooks and a scheduler.</p><p>So why are people so impressed?</p><p>Because the innovation isn’t <em>how</em> it works — it’s <strong>where it lives</strong>.</p><p>Traditional AI tools like ChatGPT require <em>you</em> to open an app and initiate work. You are always in control. Moltbot, on the other hand, <strong>lives inside your messenger</strong> — WhatsApp, Telegram, the same place you already spend your day. It’s always on. Always present.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*k9hXm4Sv_wD5D78Yjf1TrQ.png" /></figure><p>If ChatGPT suddenly started messaging you first, it would feel like spam.<br> But Moltbot feels like a personal assistant.</p><p>The technology is similar. The <strong>UX and power dynamics are not</strong>.</p><p>Who controls the conversation?<br> Who is always running?</p><p>This shift — the idea of an AI agent that <em>lives where you live</em> — is the first reason people are excited.</p><h3>The One Genuine Technical Differentiator: Disk-Based Memory</h3><p>There <em>is</em>, however, one truly meaningful technical distinction.</p><p>Moltbot uses the <strong>local hard disk as long-term memory</strong>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*TFXPMIbtcodKXHQaiLWD6g.png" /></figure><p>Most cloud AIs eventually forget earlier context as conversations grow longer. 
Moltbot stores conversations and state as files and reloads them later. As long as your disk has space, memory is effectively unlimited.</p><p>This disk-backed, persistent memory model is genuinely hard for traditional cloud services to replicate — and it is Moltbot’s strongest technical advantage.</p><h3>2. Model Limitations and the Refactoring Hell Problem</h3><p>Despite its innovative positioning, there are serious concerns from a professional developer’s perspective.</p><h3>Memory Is Not Truth</h3><p>Many people assume that because the bot “remembers,” it will automatically improve over time. That’s a dangerous misunderstanding.</p><p>Without a <strong>Single Source of Truth</strong>, persistent memory simply preserves <em>mistakes</em> along with correct information.<br> If your system doesn’t have a top-level constitution — immutable rules that override everything else — the AI will faithfully repeat bad assumptions forever.</p><p>Right now, Moltbot lacks this.</p><h3>The Inability to Stop</h3><p>This is the most critical issue.</p><p>Agents like Moltbot suffer from the same fundamental problem that plagued <strong>Auto-GPT</strong>: once they start down a wrong path, they <strong>don’t know when to stop</strong>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*RhSdgZ9ZDlDAb8-SgbMwWA.png" /></figure><p>Humans feel uncertainty. When something feels off, we pause, reconsider, or ask for help. Current LLM-based agents don’t. If the initial assumption is wrong, they keep building on it, refactoring again and again — until the entire project collapses into a mess.</p><p>This isn’t a Moltbot-specific design flaw. 
It’s a limitation of today’s models.<br> They lack robust uncertainty detection and the ability to say, <em>“I’m not sure — I need human input.”</em></p><p>The result is what many developers have experienced firsthand: <strong>refactoring hell</strong>.</p><p>In short, Moltbot is an excellent proposal for <em>where</em> AI should live — but the underlying model maturity isn’t there yet for it to be a primary development tool. For now, it should be treated as a powerful <strong>experimental platform</strong>, not production infrastructure.</p><h3>3. About the Mac Mini Frenzy</h3><p>Finally, let’s talk hardware.</p><p>The model limitations I mentioned will improve over time. And when they do, we’ll move away from cloud APIs toward <strong>local, always-on intelligence</strong>. That’s the future Moltbot hints at.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*7qr9WbuoZPacMmSgMJla5g.png" /></figure><p>For that reason, I don’t recommend buying a base-model Mac mini <em>just</em> to run Moltbot today.</p><p>But if your goal is to someday run <strong>Gemini-2.5-Pro-level intelligence locally</strong>, without worrying about usage fees, that’s a different story.</p><p>In that future, the most important spec won’t be CPU speed.</p><p>It will be <strong>memory</strong>.</p><p>I strongly believe that machines with <strong>at least 64GB of RAM</strong> will become the foundation of personal AI infrastructure. Prepare for that future, and you’ll be ready when the models finally catch up.</p><p>Thanks for reading.<br> If you have thoughts or questions, I’d love to continue the discussion in the comments.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=9e9d60a86e6c" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Why did Anthropic allow “other people’s models” in Claude Code?]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://medium.com/dooboolab/why-did-anthropic-allow-other-peoples-models-in-claude-code-b0a050b15fec?source=rss-6de083ee3256------2"><img src="https://cdn-images-1.medium.com/max/1692/1*3iin8bATR2sRjOAEpQNbbA.png" width="1692"></a></p><p class="medium-feed-snippet">How &#x201C;technical permission&#x201D; became a strategy to own the interface, not just the model.</p><p class="medium-feed-link"><a href="https://medium.com/dooboolab/why-did-anthropic-allow-other-peoples-models-in-claude-code-b0a050b15fec?source=rss-6de083ee3256------2">Continue reading on Hyo Dev »</a></p></div>]]></description>
            <link>https://medium.com/dooboolab/why-did-anthropic-allow-other-peoples-models-in-claude-code-b0a050b15fec?source=rss-6de083ee3256------2</link>
            <guid isPermaLink="false">https://medium.com/p/b0a050b15fec</guid>
            <category><![CDATA[anthropic-claude]]></category>
            <category><![CDATA[open-code]]></category>
            <category><![CDATA[claude]]></category>
            <category><![CDATA[ollama]]></category>
            <category><![CDATA[claude-code]]></category>
            <dc:creator><![CDATA[Hyo]]></dc:creator>
            <pubDate>Wed, 21 Jan 2026 16:40:51 GMT</pubDate>
            <atom:updated>2026-01-23T12:51:31.599Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[AI Is Dumb, So I Code Everything Myself]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://medium.com/dooboolab/ai-is-dumb-so-i-code-everything-myself-6e6e3004351e?source=rss-6de083ee3256------2"><img src="https://cdn-images-1.medium.com/max/1146/1*5sIAkQ8UrG0oIRDOrU6gjw.png" width="1146"></a></p><p class="medium-feed-snippet">&#x201C;AI Is Dumb, So I Code Everything Myself.&#x201D;</p><p class="medium-feed-link"><a href="https://medium.com/dooboolab/ai-is-dumb-so-i-code-everything-myself-6e6e3004351e?source=rss-6de083ee3256------2">Continue reading on Hyo Dev »</a></p></div>]]></description>
            <link>https://medium.com/dooboolab/ai-is-dumb-so-i-code-everything-myself-6e6e3004351e?source=rss-6de083ee3256------2</link>
            <guid isPermaLink="false">https://medium.com/p/6e6e3004351e</guid>
            <category><![CDATA[llm]]></category>
            <category><![CDATA[opensource-ai]]></category>
            <category><![CDATA[ai]]></category>
            <category><![CDATA[software-engineering]]></category>
            <category><![CDATA[software-development]]></category>
            <dc:creator><![CDATA[Hyo]]></dc:creator>
            <pubDate>Mon, 19 Jan 2026 01:23:57 GMT</pubDate>
            <atom:updated>2026-01-21T10:11:23.554Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[Running AI Locally on Apple Silicon with MLX]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://medium.com/dooboolab/running-ai-locally-on-apple-silicon-with-mlx-6e6b29ee10cf?source=rss-6de083ee3256------2"><img src="https://cdn-images-1.medium.com/max/1920/1*OJAqA4lcozw0v_8mW2w2Uw.jpeg" width="1920"></a></p><p class="medium-feed-snippet">When people talk about AI today, the first things that usually come to mind are cloud APIs and massive GPU servers. Models live somewhere&#x2026;</p><p class="medium-feed-link"><a href="https://medium.com/dooboolab/running-ai-locally-on-apple-silicon-with-mlx-6e6b29ee10cf?source=rss-6de083ee3256------2">Continue reading on Hyo Dev »</a></p></div>]]></description>
            <link>https://medium.com/dooboolab/running-ai-locally-on-apple-silicon-with-mlx-6e6b29ee10cf?source=rss-6de083ee3256------2</link>
            <guid isPermaLink="false">https://medium.com/p/6e6b29ee10cf</guid>
            <category><![CDATA[llm]]></category>
            <category><![CDATA[on-device-llm]]></category>
            <category><![CDATA[ai-on-device]]></category>
            <category><![CDATA[ai]]></category>
            <category><![CDATA[mlx]]></category>
            <dc:creator><![CDATA[Hyo]]></dc:creator>
            <pubDate>Wed, 07 Jan 2026 22:00:00 GMT</pubDate>
            <atom:updated>2026-01-07T22:00:12.910Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[Building In-App Purchases for Godot Engine: OpenIAP Unlocked]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://medium.com/dooboolab/building-in-app-purchases-for-godot-engine-the-openiap-journey-112c98c765fd?source=rss-6de083ee3256------2"><img src="https://cdn-images-1.medium.com/max/1200/1*oV0gwf1BIe58Kuxf0tTA9A.jpeg" width="1200"></a></p><p class="medium-feed-snippet">How we brought a unified IAP experience to Godot, inspired by years of cross-platform development</p><p class="medium-feed-link"><a href="https://medium.com/dooboolab/building-in-app-purchases-for-godot-engine-the-openiap-journey-112c98c765fd?source=rss-6de083ee3256------2">Continue reading on Hyo Dev »</a></p></div>]]></description>
            <link>https://medium.com/dooboolab/building-in-app-purchases-for-godot-engine-the-openiap-journey-112c98c765fd?source=rss-6de083ee3256------2</link>
            <guid isPermaLink="false">https://medium.com/p/112c98c765fd</guid>
            <category><![CDATA[godot-engine]]></category>
            <category><![CDATA[in-app-purchase]]></category>
            <category><![CDATA[openiap]]></category>
            <category><![CDATA[godot]]></category>
            <category><![CDATA[mobile-development]]></category>
            <dc:creator><![CDATA[Hyo]]></dc:creator>
            <pubDate>Thu, 01 Jan 2026 01:07:58 GMT</pubDate>
            <atom:updated>2026-01-01T14:38:17.279Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[kstyled: Compile-Time styled-components for React Native (with Zero Runtime Cost)]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://medium.com/dooboolab/kstyled-compile-time-styled-components-for-react-native-with-zero-runtime-cost-e1015a2f2440?source=rss-6de083ee3256------2"><img src="https://cdn-images-1.medium.com/max/1536/0*4ybgc7o1ZcyyS6Yh.png" width="1536"></a></p><p class="medium-feed-snippet">When you build a React Native app that actually ships to users, styling becomes more than &#x201C;just CSS&#x201D;.</p><p class="medium-feed-link"><a href="https://medium.com/dooboolab/kstyled-compile-time-styled-components-for-react-native-with-zero-runtime-cost-e1015a2f2440?source=rss-6de083ee3256------2">Continue reading on Hyo Dev »</a></p></div>]]></description>
            <link>https://medium.com/dooboolab/kstyled-compile-time-styled-components-for-react-native-with-zero-runtime-cost-e1015a2f2440?source=rss-6de083ee3256------2</link>
            <guid isPermaLink="false">https://medium.com/p/e1015a2f2440</guid>
            <category><![CDATA[styled]]></category>
            <category><![CDATA[stylesheets]]></category>
            <category><![CDATA[react-native]]></category>
            <category><![CDATA[css]]></category>
            <category><![CDATA[styled-components]]></category>
            <dc:creator><![CDATA[Hyo]]></dc:creator>
            <pubDate>Thu, 04 Dec 2025 15:50:50 GMT</pubDate>
            <atom:updated>2025-12-04T16:28:01.339Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[Ending the In-App Purchases Hell: From Native SDKs to Cross-Platform Libraries for React Native…]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://medium.com/dooboolab/ending-the-in-app-purchases-hell-from-native-sdks-to-cross-platform-libraries-for-react-native-14c18fa3436c?source=rss-6de083ee3256------2"><img src="https://cdn-images-1.medium.com/max/896/1*JoCArEUj2WDoccY5EN92Ew.png" width="896"></a></p><p class="medium-feed-snippet">TL;DR
I spent 2025 rebuilding every IAP library I&#x2019;ve maintained for years&#x200A;&#x2014;&#x200A;React Native, Expo, Flutter&#x200A;&#x2014;&#x200A;and realized the ecosystem was&#x2026;</p><p class="medium-feed-link"><a href="https://medium.com/dooboolab/ending-the-in-app-purchases-hell-from-native-sdks-to-cross-platform-libraries-for-react-native-14c18fa3436c?source=rss-6de083ee3256------2">Continue reading on Hyo Dev »</a></p></div>]]></description>
            <link>https://medium.com/dooboolab/ending-the-in-app-purchases-hell-from-native-sdks-to-cross-platform-libraries-for-react-native-14c18fa3436c?source=rss-6de083ee3256------2</link>
            <guid isPermaLink="false">https://medium.com/p/14c18fa3436c</guid>
            <category><![CDATA[open-source]]></category>
            <category><![CDATA[in-app-purchase]]></category>
            <category><![CDATA[cross-platform]]></category>
            <category><![CDATA[mobile-development]]></category>
            <category><![CDATA[software-architecture]]></category>
            <dc:creator><![CDATA[Hyo]]></dc:creator>
            <pubDate>Sun, 30 Nov 2025 12:52:55 GMT</pubDate>
            <atom:updated>2025-11-30T14:46:25.782Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[3 Recent React Native Development Notes]]></title>
            <link>https://medium.com/dooboolab/3-recent-react-native-development-notes-6f92f5138759?source=rss-6de083ee3256------2</link>
            <guid isPermaLink="false">https://medium.com/p/6f92f5138759</guid>
            <category><![CDATA[react-native-tutorial]]></category>
            <category><![CDATA[react-native]]></category>
            <category><![CDATA[react-native-developers]]></category>
            <category><![CDATA[react-native-development]]></category>
            <category><![CDATA[expo]]></category>
            <dc:creator><![CDATA[Hyo]]></dc:creator>
            <pubDate>Sat, 10 Aug 2024 11:21:39 GMT</pubDate>
            <atom:updated>2024-08-10T11:21:39.387Z</atom:updated>
            <content:encoded><![CDATA[<h4>Weekly Development, 2nd Week of August 2024</h4><p>A busy schedule has often kept me from writing consistently, so I’ve decided to start a weekly series titled “Weekly Development Activity Briefing.” In this series, I plan to address the challenges faced during development each week.</p><p>Currently, my primary focus is on React Native development, so most of the content will revolve around that. For this week, I’ll briefly cover three key issues.</p><h3>First, let’s address the issue related to the keyExtractor in FlashList.</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*yerm049DtHcm45bz8KYssw.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/604/1*S8LVmvb8dtjDgI-xWvwsTg.png" /><figcaption><a href="https://github.com/Shopify/flash-list/issues/730">https://github.com/Shopify/flash-list/issues/730</a></figcaption></figure><p>The issue occurs intermittently when using FlashList and seems to resurface with each new development cycle.</p><p>To briefly explain the issue: when using FlashList, if you scroll quickly through a list that has pagination applied, you may encounter a warning about a possible stableId collision.</p><pre>&lt;FlashList<br>  data={data}<br>  keyExtractor={(item: any, i: number) =&gt; `${i}-${item.title}`}<br>  renderItem={({item}) =&gt; (<br>    &lt;View<br>      style={css`<br>        padding: 16px;<br>      `}<br>    &gt;<br>      &lt;Typography.Body2&gt;{item.title}&lt;/Typography.Body2&gt;<br>    &lt;/View&gt;<br>  )}<br>  estimatedItemSize={10}<br>/&gt;</pre><p>The issue can be easily resolved by using useCallback.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*hHyJDa_00RH_s96qt4bQLA.png" /><figcaption><a href="https://github.com/Shopify/flash-list/issues/730#issuecomment-1741385659">https://github.com/Shopify/flash-list/issues/730#issuecomment-1741385659</a></figcaption></figure><p>This issue appears to be a typical
concurrency problem in React, so I addressed it as follows:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*G7ujt-B2Q_kzBykBiV4iOQ.png" /><figcaption><a href="https://github.com/Shopify/flash-list/issues/730#issuecomment-2277445687">https://github.com/Shopify/flash-list/issues/730#issuecomment-2277445687</a></figcaption></figure><h4>Secondly, there is an issue related to the Android debug keystore.</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ey45mVz3aFeOtIfwjTEAeg.png" /><figcaption><a href="https://stackoverflow.com/questions/54417232/react-native-google-signin-gives-developer-error">https://stackoverflow.com/questions/54417232/react-native-google-signin-gives-developer-error</a></figcaption></figure><p>When implementing features like Google login in React Native, it’s crucial to thoroughly test in a debug environment before moving to production. Typically, developers with Android experience know that on macOS, the debug keystore can be found in the ~/.android directory. I’ve repeatedly made the same mistake of overlooking this, often encountering “Developer Error” as a result. Each time, I had to rely on recalling past experiences to resolve the issue.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/482/1*mrk-HhKHpUZ0VsPlw4FpqQ.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/716/1*agaAREvqGDD_iyaMGTs6zA.png" /></figure><p>Typically, when you create an Android project in React Native, a debug.keystore is generated within the app directory. 
You need to extract the SHA-1 value from this keystore and add it to your social provider&#39;s dashboard to ensure that social logins and other features work correctly in the testing environment.</p><h4>Thirdly, there is an issue related to the new architecture on Android.</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*WEf50c0ZCjL0aap7riM0QA.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/494/1*sl2wu5D7lEG4VvdSo9uLLw.png" /></figure><p>After applying the <a href="https://reactnative.dev/docs/the-new-architecture/landing-page">new architecture</a> and adding a <a href="https://reactnative.dev/docs/pressable">Pressable</a> button to the headerLeft in <a href="https://docs.expo.dev/router/introduction/">expo-router</a> to navigate back, I found that it didn’t work as expected. Upon investigating, I discovered that this issue is related to <a href="https://github.com/software-mansion/react-native-screens">react-native-screens</a>. Interestingly, while onPress doesn’t fire, onPressIn does trigger the action. Additionally, this issue only occurs on physical devices; the button works fine on the emulator.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*l7J9-eHwSamc38_O0p6D4w.png" /><figcaption><a href="https://github.com/software-mansion/react-native-screens/issues/1975#issuecomment-1819084348">https://github.com/software-mansion/react-native-screens/issues/1975#issuecomment-1819084348</a></figcaption></figure><p>For now, it seems that using <a href="https://docs.swmansion.com/react-native-gesture-handler/docs/components/buttons/#rectbutton">RectButton</a> from <a href="https://docs.swmansion.com/react-native-gesture-handler/">react-native-gesture-handler</a> can resolve the issue. If you found this helpful, I’ll continue to post this weekly briefing series.
If you prefer to read the articles in Korean, please subscribe to the <a href="https://www.youtube.com/watch?v=Gkr9rvKdl1Y">CrossPlatform-Korea YouTube</a> channel.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=6f92f5138759" width="1" height="1" alt=""><hr><p><a href="https://medium.com/dooboolab/3-recent-react-native-development-notes-6f92f5138759">3 Recent React Native Development Notes</a> was originally published in <a href="https://medium.com/dooboolab">Hyo Dev</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Introduction to React Native, Part 3: Styling]]></title>
            <link>https://hyodotdev.medium.com/%EB%A6%AC%EC%95%A1%ED%8A%B8%EB%84%A4%EC%9D%B4%ED%8B%B0%EB%B8%8C-%EC%9E%85%EB%AC%B8-3-%EC%8A%A4%ED%83%80%EC%9D%BC-fadc96b79b94?source=rss-6de083ee3256------2</link>
            <guid isPermaLink="false">https://medium.com/p/fadc96b79b94</guid>
            <category><![CDATA[emotions]]></category>
            <category><![CDATA[react-native]]></category>
            <category><![CDATA[beginner-development]]></category>
            <category><![CDATA[beginner-developers]]></category>
            <category><![CDATA[css]]></category>
            <dc:creator><![CDATA[Hyo]]></dc:creator>
            <pubDate>Fri, 16 Feb 2024 14:17:31 GMT</pubDate>
            <atom:updated>2024-02-16T14:17:31.133Z</atom:updated>
            <content:encoded><![CDATA[<h3>Introduction to React Native, Part 3: Styling</h3><blockquote>Before reading this article, we recommend first going through the <a href="https://medium.com/crossplatformkorea/%EC%95%B1-%EA%B0%9C%EB%B0%9C-%EC%9E%85%EB%AC%B8%EC%9A%A9-%EB%AC%B8%ED%97%8C-e676bfb2f998">introductory guide for app development beginners</a>. This series covers the background knowledge you need before studying React Native. This installment covers <strong>styling</strong>; the goal is to equip you with only the essentials before you start developing.</blockquote><p>One of React Native’s strengths is that you can build UIs using the CSS syntax familiar from the web. This has been a huge advantage for developers: instead of mastering a different UI technology for every framework, you can apply your existing CSS skills broadly.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*c1gGqMNxOy2QMu3b" /></figure><p>CSS (Cascading Style Sheets) is a stylesheet language that defines the design and layout of web pages. React Native uses a similar styling approach, but the syntax is not identical: styles are defined as JavaScript objects through the StyleSheet API. It feels much like CSS on the web, except that property names are written in camelCase and most values are numbers or objects rather than strings.</p><pre>import React from &#39;react&#39;;<br>import { Text, StyleSheet, View } from &#39;react-native&#39;;<br><br>const App = () =&gt; (<br>  &lt;View style={styles.container}&gt;<br>    &lt;Text style={styles.textStyle}&gt;Hello, React Native!&lt;/Text&gt;<br>  &lt;/View&gt;<br>);<br><br>const styles = StyleSheet.create({<br>  container: {<br>    flex: 1,<br>    justifyContent: &#39;center&#39;,<br>    alignItems: &#39;center&#39;,<br>    backgroundColor: &#39;#F5FCFF&#39;,<br>  },<br>  textStyle: {<br>    color: &#39;blue&#39;,<br>    fontSize: 20,<br>  },<br>});<br><br>export default App;</pre><p>In the code above, the StyleSheet.create method defines two style objects, container and textStyle. The container style sets the view’s background color and alignment, while textStyle defines the text color and font size. In this way, React Native supports CSS-like styling.</p><p>Going further, we use the <a href="https://emotion.sh/docs/introduction">Emotion</a> library with <a href="https://reactnative.dev/">React Native</a> and recommend it. Emotion is a powerful, flexible styling tool that also works for React Native development: it combines the strengths of JavaScript and CSS so you can define and manage component styles with ease.</p><p>Here is a basic example of using Emotion in React Native:</p><pre>import React from &#39;react&#39;;<br>import styled from &#39;@emotion/native&#39;; // import the Emotion library<br><br>// Styling with Emotion<br>const Container = styled.View`<br>  flex: 1;<br>  justify-content: center;<br>  align-items: center;<br>  background-color: #F5FCFF;<br>`;<br><br>const StyledText = styled.Text`<br>  color: blue;<br>  font-size: 20px;<br>`;<br><br>const App = () =&gt; (<br>  &lt;Container&gt;<br>    &lt;StyledText&gt;Hello, React Native with Emotion!&lt;/StyledText&gt;<br>  &lt;/Container&gt;<br>);<br><br>export default App;</pre><p>Learning CSS syntax is important for styling in React Native, and understanding flex is the key. React Native’s layout system works very much like CSS Flexbox on the web, which is essential for building flexible layouts and designs in mobile apps.</p><h3>What Is Flex?</h3><p>Flex provides an efficient way to position, align, and distribute items within a container. In React Native, the flex properties control how child elements are laid out and how they share the available space.</p><h3>Basic Flex Properties</h3><ul><li>flex: the proportion of space an element takes up; a higher number claims more space.</li><li>flexDirection: the direction of the main axis, either row (horizontal) or column (vertical).</li><li>justifyContent: how children are aligned along the main axis, e.g. flex-start, center, flex-end, space-between, space-around.</li><li>alignItems: how children are aligned along the cross axis, e.g. flex-start, center, flex-end, stretch.</li><li>alignSelf: how an individual item is aligned along the cross axis: auto, flex-start, center, flex-end, stretch.</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*JBERd2jYiu3SpNfiZ6UrxA.png" /><figcaption><a href="https://www.youtube.com/watch?v=07ioPW-VKd4&amp;t=737s">https://www.youtube.com/watch?v=07ioPW-VKd4&amp;t=737s</a></figcaption></figure><p>I also recommend the <a href="https://www.youtube.com/watch?v=07ioPW-VKd4&amp;t=737s">video on Flex</a> I recorded a while back. I hope it helps you understand Flex and get comfortable with layout and CSS.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=fadc96b79b94" width="1" height="1" alt="">]]></content:encoded>
        </item>
    </channel>
</rss>