<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Stories by Noraina Nordin on Medium]]></title>
        <description><![CDATA[Stories by Noraina Nordin on Medium]]></description>
        <link>https://medium.com/@aina.learns22?source=rss-79af36e6978c------2</link>
        <image>
            <url>https://cdn-images-1.medium.com/fit/c/150/150/1*vTuuPGgNHymqn9mB6_92zw.png</url>
            <title>Stories by Noraina Nordin on Medium</title>
            <link>https://medium.com/@aina.learns22?source=rss-79af36e6978c------2</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Tue, 07 Apr 2026 03:47:12 GMT</lastBuildDate>
        <atom:link href="https://medium.com/@aina.learns22/feed" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[Build an AI Overview Rank Tracker using Python]]></title>
            <link>https://medium.com/@aina.learns22/build-an-ai-overview-rank-tracker-using-python-b3840cc352bc?source=rss-79af36e6978c------2</link>
            <guid isPermaLink="false">https://medium.com/p/b3840cc352bc</guid>
            <dc:creator><![CDATA[Noraina Nordin]]></dc:creator>
            <pubDate>Wed, 04 Feb 2026 00:28:26 GMT</pubDate>
            <atom:updated>2026-02-04T00:28:26.309Z</atom:updated>
<content:encoded><![CDATA[<p>Search engines are evolving rapidly. Instead of presenting only a list of websites, Google now frequently displays AI Overviews: AI-generated summaries that combine information from multiple sources and present answers directly at the top of the search results page.</p><p>As a result, visibility is no longer defined solely by rankings. This shift has introduced a new concept known as Generative Engine Optimization (GEO).</p><p>In this tutorial, we’ll walk through how to build a simple AI Overview Rank Tracker using Python.</p><h3>What Is an AI Overview Rank Tracker?</h3><p>An AI Overview Rank Tracker is a tool that answers a new kind of visibility question:</p><blockquote>“Is my website cited inside Google’s AI Overview, and how prominent is that citation?”</blockquote><p>Traditional rank trackers focus on organic positions (1–10).<br>An AI Overview Rank Tracker focuses on:</p><ul><li>Whether an AI Overview appears for a query</li><li>Which sources the AI references</li><li>The order in which those sources appear</li><li>Whether your brand or competitors are included</li></ul><p>This is a core component of Generative Engine Optimization (GEO).</p><h3>Why AI Overviews Change How Ranking Works</h3><p>AI Overviews often appear above organic results and summarize information directly on the search results page. 
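</p><p>A tracker answering the questions above can represent each check as a small record. Below is a sketch in Python; the class and field names are our own illustration, not part of any API.</p>

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class AIOverviewCheck:
    """One tracking observation for a single query (illustrative field names)."""
    query: str
    has_ai_overview: bool  # does an AI Overview appear at all?
    references: List[Tuple[str, str]] = field(default_factory=list)  # (title, link) pairs, in citation order
    our_rank: Optional[int] = None  # 1-based position of our site, if cited

# Example observation for a hypothetical query:
check = AIOverviewCheck(
    query="what is generative engine optimization",
    has_ai_overview=True,
    references=[("GEO explained", "https://example.com/geo")],
    our_rank=1,
)
```

<p>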
In many cases, users get their answers without clicking any links.</p><p>This introduces several challenges:</p><ul><li>Traditional rankings become less visible</li><li>Click-through rates are harder to attribute</li><li>Competition shifts toward citations, not just rankings</li><li>Existing SEO tools offer little insight into AI-generated answers</li></ul><p>As AI Overviews become more common, tracking AI citation presence becomes essential.</p><p>For a more detailed explanation of rank tracking in the age of AI Overviews, read our previous post below:</p><p><a href="https://serpapi.com/blog/rank-tracking-in-the-age-of-ai-overviews-whats-changed/">Rank Tracking in the Age of AI Overviews: What’s Changed</a></p><h3>Why Traditional Scraping Is Not Enough</h3><p>Tracking AI Overviews via browser automation or raw HTML scraping is unreliable due to:</p><ul><li>JavaScript-rendered content</li><li>Frequent layout changes</li><li>Regional and language variations</li><li>Bot detection and captchas</li><li>Fragile DOM-based selectors</li></ul><p>These issues make traditional scraping brittle and difficult to maintain at scale.</p><h3>Using SerpApi to Build an AI Overview Rank Tracker</h3><p>SerpApi provides real-time access to structured search engine data, including AI-generated elements where available. It handles:</p><ul><li>IP rotation and request routing</li><li>JavaScript rendering</li><li>Layout changes across regions</li><li>High-volume request reliability</li></ul><p>By using a <a href="https://serpapi.com/">Web Search API</a>, developers can focus on analysis and insights rather than infrastructure maintenance.</p><p>Check out the tutorial below on how to get results from AI Overviews using our API:</p><p><a href="https://serpapi.com/blog/scrape-google-ai-overviews/">How to Scrape Google AI Overviews (AIO)</a></p><p>Prefer watching instead of reading? 
You can also follow along with the video walkthrough below:</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FTa58WTvA5qg%3Ffeature%3Doembed&amp;display_name=YouTube&amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DTa58WTvA5qg&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FTa58WTvA5qg%2Fhqdefault.jpg&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/613d5335ae9183e213c705d6208058b4/href">https://medium.com/media/613d5335ae9183e213c705d6208058b4/href</a></iframe><h3>Building an AI Overview Rank Tracker using Python</h3><p>Below is a step-by-step implementation that:</p><ul><li>Detects AI Overviews</li><li>Fetches AI Overview details</li><li>Extracts cited sources</li><li>Prints title and link only</li><li>Determines whether a specific website is cited</li><li>Reports the site’s rank inside the AI Overview</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*K7IJiX1_S6Rg_5vI.gif" /></figure><h3>Step 1: Install dependencies</h3><pre>pip install google-search-results python-dotenv</pre><h3>Step 2: Run a Google search and detect AI Overview presence</h3><p>AI Overview does not appear for every query.</p><p>The first step is to detect whether one exists and retrieve its page_token.</p><pre>from serpapi import GoogleSearch<br>import os<br><br>SERPAPI_API_KEY = os.getenv(&quot;SERPAPI_API_KEY&quot;)<br>def detect_ai_overview(query):<br>    params = {<br>        &quot;engine&quot;: &quot;google&quot;,<br>        &quot;q&quot;: query,<br>        &quot;hl&quot;: &quot;en&quot;,<br>        &quot;gl&quot;: &quot;us&quot;,<br>        &quot;api_key&quot;: SERPAPI_API_KEY<br>    }<br>    search = GoogleSearch(params)<br>    results = search.get_dict()<br>    ai_overview = results.get(&quot;ai_overview&quot;)<br>    if not ai_overview:<br>        return None<br>    return ai_overview.get(&quot;page_token&quot;)</pre><p>The 
query here refers to the target keyword or search phrase you want to monitor for AI Overview visibility.<br>If no page_token is returned, the query does not trigger an AI Overview.</p><h3>Step 3: Fetch the AI Overview result page</h3><p>Once a page token is available, use the google_ai_overview engine to retrieve full AI Overview data.</p><pre>def fetch_ai_overview(page_token):<br>    params = {<br>        &quot;engine&quot;: &quot;google_ai_overview&quot;,<br>        &quot;page_token&quot;: page_token,<br>        &quot;api_key&quot;: SERPAPI_API_KEY<br>    }<br><br>    search = GoogleSearch(params)<br>    return search.get_dict()</pre><p>This response includes structured AI content and cited references.<br>Note: since this is a separate API endpoint, it consumes one additional search credit.</p><h3>Step 4: Extract citations and calculate rank</h3><p>In AI Overviews, ranking is determined by the order in which references are returned by the API.</p><pre>from urllib.parse import urlparse<br><br>def normalize_domain(url):<br>    return urlparse(url).netloc.replace(&quot;www.&quot;, &quot;&quot;)<br><br>def analyze_ai_overview(ai_results, target_url):<br>    references = ai_results.get(&quot;ai_overview&quot;, {}).get(&quot;references&quot;, [])<br>    target_domain = normalize_domain(target_url)<br>    found_rank = None<br>    print(&quot;\nAI Overview References:\n&quot;)<br>    for rank, ref in enumerate(references, start=1):<br>        title = ref.get(&quot;title&quot;)<br>        link = ref.get(&quot;link&quot;)<br>        print(f&quot;{rank}. 
{title}&quot;)<br>        print(f&quot;   {link}\n&quot;)<br>        if target_domain in normalize_domain(link):<br>            found_rank = rank<br>    if found_rank:<br>        print(f&quot;Your site appears in the AI Overview at position #{found_rank}&quot;)<br>    else:<br>        print(&quot;Your site is not referenced in the AI Overview&quot;)</pre><h3>Step 5: Run the full AI Overview Rank Tracker</h3><pre>if __name__ == &quot;__main__&quot;:<br>    query = input(&quot;Enter target keyword to track: &quot;).strip()<br>    website = input(&quot;Enter your website URL: &quot;).strip()<br><br>    page_token = detect_ai_overview(query)<br>    if not page_token:<br>        print(&quot;No AI Overview detected for this query.&quot;)<br>    else:<br>        ai_results = fetch_ai_overview(page_token)<br>        analyze_ai_overview(ai_results, website)</pre><h4>Result</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*n1h2QOnIkBt_qhLX.png" /></figure><p>Full code is available in our GitHub <a href="https://github.com/serpapi/tutorials/tree/master/python_projects/ai_overview_rank_tracker">serpapi/tutorials</a> repository.</p><h3>Conclusion</h3><p>AI Overviews are redefining search visibility. As generative search experiences continue to expand, being cited by AI systems becomes as important as ranking organically.</p><p>An AI Overview Rank Tracker allows developers and SEO teams to measure this new form of visibility using structured, reliable data. By leveraging <a href="https://serpapi.com/search-api">Google Search API</a> and <a href="https://serpapi.com/ai-overview">Google AI Overview API</a>, we can move beyond fragile scraping and build scalable GEO monitoring systems for the future of search.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=b3840cc352bc" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[How AI Can Predict the Success of Your Business Using Data from Google Maps]]></title>
            <link>https://medium.com/@aina.learns22/how-ai-can-predict-the-success-of-your-business-using-data-from-google-maps-77be913dd2ee?source=rss-79af36e6978c------2</link>
            <guid isPermaLink="false">https://medium.com/p/77be913dd2ee</guid>
            <category><![CDATA[web-scraping-tools]]></category>
            <category><![CDATA[python]]></category>
            <category><![CDATA[ai-agent]]></category>
            <category><![CDATA[web-scraping]]></category>
            <dc:creator><![CDATA[Noraina Nordin]]></dc:creator>
            <pubDate>Wed, 17 Sep 2025 10:56:38 GMT</pubDate>
            <atom:updated>2025-09-17T10:56:38.662Z</atom:updated>
<content:encoded><![CDATA[<p>Coffee shops are booming in my area. Every month, I hear about a new one opening, which makes me wonder:</p><blockquote><em>Will these businesses last long? Or are they just seasonal hype?</em></blockquote><p>This curiosity inspired me to create an AI agent that can explore the coffee shop market and predict its potential success. While I’m focusing on coffee shops in this project, the same framework can be applied to other businesses, such as restaurants, salons, gyms, and retail stores. However, because LLMs are limited by their knowledge cutoff dates, having access to real-time data is crucial for obtaining the most up-to-date insights.</p><p>Traditionally, entrepreneurs relied on guesswork, small surveys, or costly market research. Now, thanks to tools like Google Maps and Google Reviews, we can access real-time information to guide better decisions.</p><p>In this project, we will use these two API solutions from SerpApi to help us gather data easily:</p><ol><li><a href="https://serpapi.com/google-maps-api">Google Maps API</a>: To scrape business listings and locations.</li><li><a href="https://serpapi.com/google-maps-reviews-api">Google Maps Reviews API</a>: To extract detailed customer reviews and ratings.</li></ol><p>Check out the <a href="https://serpapi.com/blog/scrape-google-maps-data-and-reviews-using-python/">Scrape Google Maps data and reviews using Python</a>, a comprehensive getting-started tutorial.</p><p><em>Please keep in mind that this AI agent is intended for educational purposes only. While it provides useful insights, predictions may not always be entirely accurate. 
To get the best results, it’s a good idea to combine AI insights with your own careful market research.</em></p><h3>How the AI Agent Works</h3><ol><li><strong>Search for coffee shops using SerpApi’s Google Maps API <br></strong>- Input either a location (e.g., “Austin, TX”) or GPS coordinates.<br>- Scrape all the local coffee shops with metadata.</li><li><strong>Pull reviews with SerpApi’s Google Maps Reviews API<br>- </strong>Extract customer opinions, ratings, and timestamps for each competitor.</li><li><strong>AI-Powered Analysis<br>- </strong>Summarize competitor density and pricing tiers.<br>- Perform sentiment analysis on customer reviews.<br>- Predict the success probability of a new coffee shop in the area.<br>- Output structured tables and an action plan in a clean Markdown file.</li></ol><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*pwxSuuOJ9bYXwk9T.png" /><figcaption>Workflow</figcaption></figure><h3>Using SerpApi to scrape data from Google Maps</h3><p>SerpApi’s Google Maps API allows us to gather organized data from Google Maps quickly and easily, without the hassle of CAPTCHA, proxies, or manual scraping stress.</p><p>With just one API request, you can access a wealth of information. Feel free to explore our <a href="https://serpapi.com/playground?engine=google_maps"><strong>Google Maps API playground</strong></a> to see what data you can retrieve. 
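</p><p>As a quick illustration, a single search to this engine is just a small parameter dict. The helper below is our own sketch (the post’s final script builds these parameters differently); the key names match the google_maps engine as used later in this tutorial.</p>

```python
def maps_params(query, location_query, api_key):
    """Build the parameter dict for one SerpApi Google Maps search (sketch)."""
    return {
        "engine": "google_maps",  # SerpApi's Google Maps engine
        "q": f"{query} in {location_query}",
        "api_key": api_key,
    }

# "YOUR_SERPAPI_KEY" is a placeholder; pass the dict to GoogleSearch(params).get_dict()
params = maps_params("Coffee Shop", "Austin, TX", "YOUR_SERPAPI_KEY")
```

<p>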
Below, there’s a sample screenshot from our playground to give you an idea.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*yjdSinBpPvT7nWWm.png" /><figcaption>Google Maps API playground</figcaption></figure><p>Out of everything returned, this is the data we need:</p><ul><li>title</li><li>data_id (we need this to retrieve the reviews later)</li><li>gps_coordinates</li><li>address</li><li>rating</li><li>reviews</li><li>price</li><li>operating_hours</li></ul><p>All data will be passed to the AI model for analysis, except for the data_id, which is required to scrape Google Maps Reviews.</p><p>Beyond ratings, reviews tell the real story. Using SerpApi’s Google Maps Reviews API, we can pull recent customer feedback for each competitor. You can visit our <a href="https://serpapi.com/playground?engine=google_maps_reviews">Google Maps Reviews API playground</a> to explore the additional information we can collect.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*eSbGo_xTJH0dejvG.png" /><figcaption>Google Maps Reviews API playground</figcaption></figure><p>The data that we’re interested in collecting:</p><ul><li>snippet, which is the customer&#39;s review text</li><li>rating</li><li>date</li></ul><h3>Set up the scraper</h3><h3>Prerequisites</h3><ul><li>Python 3.9+</li><li>Install dependencies:<br>pip install google-search-results openai python-dotenv requests</li><li><a href="https://serpapi.com/manage-api-key">SerpApi API keys</a> (sign up <a href="https://serpapi.com/users/sign_up">here</a> to get 250 free searches/month).</li><li>OpenAI API key or download the <a href="https://ollama.com/library/gpt-oss">gpt-oss</a> model locally via <a href="https://ollama.com/download">ollama</a>.</li><li>Create a .env file with your API keys.</li></ul><pre>SERPAPI_API_KEY=your_serpapi_key_here<br>OPENAI_API_KEY=your_openai_key_here # If you&#39;re using OpenAI</pre><h3>Import the necessary libraries</h3><p>Before we begin, let’s import the 
necessary libraries for making API calls, logging, managing environment variables, and handling dates and times.</p><pre>import os<br>import json<br>import time<br>import logging<br>from datetime import datetime<br>from dotenv import load_dotenv<br>from serpapi import GoogleSearch<br>from openai import OpenAI<br>import requests</pre><p>For our convenience, we will create a log to record debug information. It will be saved in the debug.log file.</p><pre>logger = logging.getLogger(&quot;coffee_agent&quot;)<br>logger.setLevel(logging.DEBUG)<br>file_handler = logging.FileHandler(&#39;debug.log&#39;)<br>file_handler.setFormatter(logging.Formatter(&#39;%(asctime)s %(levelname)s: %(message)s&#39;, datefmt=&#39;%Y-%m-%d %H:%M:%S&#39;))<br>logger.addHandler(file_handler)<br>stream_handler = logging.StreamHandler()<br>stream_handler.setFormatter(logging.Formatter(&#39;%(asctime)s %(levelname)s: %(message)s&#39;, datefmt=&#39;%Y-%m-%d %H:%M:%S&#39;))<br>logger.addHandler(stream_handler)</pre><h3>Scrape data from Google Maps</h3><p>The first step is to query Google Maps for coffee shops around a given location or coordinates.</p><p>We define a helper function to build the search parameters:</p><pre>BUSINESS_TYPE = &quot;Coffee Shop&quot;<br><br>def get_search_params(user_input, api_key):<br>    if &quot;,&quot; in user_input and all(part.replace(&#39;.&#39;, &#39;&#39;, 1).replace(&#39;-&#39;, &#39;&#39;, 1).isdigit() for part in user_input.split(&#39;,&#39;)):<br>        logger.info(f&quot;Fetching {BUSINESS_TYPE.lower()} near coordinates: {user_input}&quot;)<br>        return {<br>            &quot;engine&quot;: &quot;google_maps&quot;,<br>            &quot;q&quot;: BUSINESS_TYPE,<br>            &quot;ll&quot;: f&quot;@{user_input},14z&quot;,<br>            &quot;api_key&quot;: api_key<br>        }<br>    else:<br>        logger.info(f&quot;Fetching {BUSINESS_TYPE.lower()} in location: {user_input}&quot;)<br>        return {<br>            &quot;engine&quot;: 
&quot;google_maps&quot;,<br>            &quot;q&quot;: f&quot;{BUSINESS_TYPE} in {user_input}&quot;,<br>            &quot;api_key&quot;: api_key<br>        }</pre><p>This function builds the right parameters depending on whether the user types a city name or latitude/longitude coordinates.</p><p>You can change the value for BUSINESS_TYPE based on your business of interest.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*gM6nloixkGj13MGk.png" /></figure><p>Now let’s use it to scrape nearby coffee shops:</p><pre>def fetch_shops_details(search_params):<br>    logger.info(f&quot;Sending request to SerpApi for {BUSINESS_TYPE.lower()} search...&quot;)<br>    response = requests.get(&quot;https://serpapi.com/search&quot;, params=search_params)<br>    data = response.json()<br>    logger.debug(f&quot;Received response: {json.dumps(data)[:500]}&quot;)<br>    logger.info(f&quot;Received response from SerpAPI.&quot;)<br>    local_results = data.get(&quot;local_results&quot;, [])<br>    logger.info(f&quot;Found {len(local_results)} {BUSINESS_TYPE.lower()}s around the area.&quot;)<br>    return local_results</pre><p>This part of the script sends a request to SerpApi’s Google Maps API to retrieve local business results based on the area specified by the user’s input.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*epKBsT9LfyBOn7XX.png" /></figure><p>Visit our <a href="https://serpapi.com/google-maps-api">Google Maps API documentation</a> for more information.</p><h3>Scrape data from Google Maps Reviews</h3><p>Each coffee shop result has a unique data_id, which we get from the previous API call. 
We can use this data_id to retrieve customer reviews.</p><pre>def fetch_reviews(data_id, shop_title):<br>    logger.info(f&quot;Fetching reviews for: {data_id} ({shop_title})&quot;)<br>    review_params = {<br>        &quot;api_key&quot;: os.getenv(&quot;SERPAPI_API_KEY&quot;),<br>        &quot;engine&quot;: &quot;google_maps_reviews&quot;,<br>        &quot;data_id&quot;: data_id,<br>        &quot;hl&quot;: &quot;en&quot;<br>    }<br>    review_search = GoogleSearch(review_params)<br>    review_results = review_search.get_dict()<br>    reviews = review_results.get(&quot;reviews&quot;, [])<br>    logger.info(f&quot;Found {len(reviews)} reviews for shop {shop_title}&quot;)<br>    time.sleep(1)<br>    return [<br>        {<br>            &quot;review_text&quot;: review.get(&quot;snippet&quot;),<br>            &quot;review_star_rating&quot;: review.get(&quot;rating&quot;),<br>            &quot;timestamp&quot;: review.get(&quot;date&quot;)<br>        }<br>        for review in reviews<br>    ]</pre><p>This function sends a request to SerpApi’s Google Maps Reviews API for each shop and returns the collected reviews, which are then attached to that coffee shop’s data.</p><p>In our example, 20 coffee shops were found in the area from the previous search, resulting in 20 requests for Google Maps Reviews. 
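</p><p>The driver script later in this post calls a build_competitor_data helper that isn’t shown inline. Here is a minimal sketch of what it could look like: the output keys match what format_competitor_data expects, the input keys come from the Google Maps fields listed earlier, and, as our own variation, the review fetcher is passed in as a parameter so the sketch stays self-contained.</p>

```python
def build_competitor_data(local_results, fetch_reviews_fn):
    """Map each Google Maps result onto the record format the AI prompt expects (sketch)."""
    competitors = []
    for shop in local_results:
        competitors.append({
            "business_name": shop.get("title"),
            "address": shop.get("address"),
            "GPS_coordinates": shop.get("gps_coordinates"),
            "star_rating": shop.get("rating"),
            "review_count": shop.get("reviews"),
            "opening_hours": shop.get("operating_hours"),
            "price_level": shop.get("price"),
            # One Google Maps Reviews request per shop, keyed by data_id
            "customer_reviews": fetch_reviews_fn(shop.get("data_id"), shop.get("title")),
        })
    return competitors
```

<p>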
Check out your search history <a href="https://serpapi.com/searches">here</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*s0lcQ3PpJ19JvCY_.png" /></figure><p>Visit our <a href="https://serpapi.com/google-maps-reviews-api">Google Maps Reviews documentation</a> for detailed information.</p><h3>Build Competitor Dataset</h3><p>Finally, let’s put everything together into a string format for the AI to analyze.</p><pre>def format_competitor_data(competitors):<br>    competitor_data_str = &quot;&quot;<br>    for shop in competitors:<br>        competitor_data_str += f&quot;\n- {shop[&#39;business_name&#39;]}\n  - Address: {shop[&#39;address&#39;]}\n  - GPS: {shop[&#39;GPS_coordinates&#39;]}\n  - Star Rating: {shop[&#39;star_rating&#39;]}\n  - Review Count: {shop[&#39;review_count&#39;]}\n  - Opening Hours: {shop[&#39;opening_hours&#39;]}\n  - Price Level: {shop[&#39;price_level&#39;]}\n  - Customer Reviews:\n&quot;<br>        for review in shop[&quot;customer_reviews&quot;]:<br>            competitor_data_str += f&quot;    - \&quot;{review[&#39;review_text&#39;]}\&quot; | {review[&#39;review_star_rating&#39;]} stars | {review[&#39;timestamp&#39;]}\n&quot;<br>    return competitor_data_str</pre><h3>Set up an AI agent</h3><p>Now that we have the competitor data, let’s build an AI agent to analyze it.</p><h3>Design the AI prompt</h3><p>LLMs perform best when provided with clear, structured instructions. Here’s an example prompt.</p><pre>def build_prompt(competitor_data_str):<br><br>    return f&quot;&quot;&quot;<br>                You are **{BUSINESS_TYPE} Success Forecaster**, an AI agent that predicts the potential success of a new {BUSINESS_TYPE.lower()} in a given location.<br>                You will be fed structured data about competitor {BUSINESS_TYPE.lower()}s, including:<br>                {competitor_data_str}<br>                Your tasks are:<br>                1. 
**Market Landscape Analysis**<br>                - Summarize competitor density within 1–3 km.<br>                - Identify star-rating patterns, price tiers, and operating hours.<br>                - Highlight underserved niches (pricing gaps, late-night, early-morning).<br>                - Output:<br>                    - A **2-column table** with | Metric | Observations |.<br>                    - A short **bullet list of underserved slots/pricing gaps**.<br>                2. **Sentiment &amp; Customer Insight**<br>                - Perform sentiment analysis on reviews.<br>                - Extract recurring positives, negatives, and unmet needs.<br>                - Output:<br>                    <br>                    | Sentiment Category | Positive Highlights | Negative Highlights | Unmet Needs / Opportunities |<br>                    |--------------------|---------------------|---------------------|-----------------------------|<br>                    <br>                - Conclude with exactly **3 key takeaways** in bullet points.<br>                3. **Success Prediction**<br>                - Estimate a **success probability score (0–100%)** in bold.<br>                - Output:<br>                    - A **2-column table** with | Metric | Value |.<br>                    - A bullet list of the **Top 3 drivers of success/failure**.<br>                4. 
**Recommendations**<br>                - Output:<br>                    - A **2-column table** with | Differentiator | Rationale |.<br>                    - A **2-column Risks &amp; Mitigations table**.<br>                    - End with an **Action Plan** that:<br>                        - Is written as **one concise paragraph (1–2 sentences, max 3 lines)**.<br>                        - Starts with: **Action plan:** (exact text, bolded).<br>                        - Specifies **location, operating hours, pricing, or promotions** directly tied to the analysis.<br>                        - Avoids bullet points, timelines (&quot;next 2 weeks&quot;), or generic advice.<br>                ⚠️ Rules for consistency:<br>                - Always be clear, data-driven, and practical.<br>                - Do not give generic answers; tailor insights directly to the provided data.<br>                - Use tables where defined, and bullets only where instructed.<br>                - Keep tone business-strategic, concise, and data-driven.<br>                - Do not mix styles between sections.<br>            &quot;&quot;&quot;<br></pre><p>Notice how we use tables, bullet points, and structured outputs to make things clearer and easier to follow. You can experiment with different prompts to see what works best for you.</p><h3>Run the AI Agent</h3><pre>def main():<br>    load_dotenv()<br>    api_key = os.getenv(&quot;SERPAPI_API_KEY&quot;)<br>    while True:<br>        user_input = input(&quot;Enter your business location name or coordinates (e.g., &#39;Austin, TX&#39; or &#39;30.2957009,-98.0626221&#39;) [type &#39;q&#39; or &#39;quit&#39; to exit]: &quot;).strip()<br>        if user_input.lower() in [&quot;q&quot;, &quot;quit&quot;]:<br>            print(&quot;Exiting program.&quot;)<br>            return<br>        if not user_input:<br>            print(&quot;[ERROR] Input cannot be empty. 
Please enter a valid location name or coordinates.&quot;)<br>            continue<br>        if not isinstance(user_input, str):<br>            print(&quot;[ERROR] Input must be a string.&quot;)<br>            continue<br>        break<br>    global now<br>    now = lambda: datetime.now().strftime(&#39;%Y-%m-%d %H:%M:%S&#39;)<br>    search_params = get_search_params(user_input, api_key)<br>    local_results = fetch_shops_details(search_params)<br>    competitors = build_competitor_data(local_results)<br>    competitor_data_str = format_competitor_data(competitors)<br>    prompt = build_prompt(competitor_data_str)<br>    client = OpenAI(<br>        base_url=&quot;http://localhost:11434/v1&quot;,  # Local Ollama API<br>        api_key=&quot;ollama&quot;                       # Dummy key<br>    )<br>    response = client.chat.completions.create(<br>        model=&quot;gpt-oss:20b&quot;,<br>        messages=[<br>            {&quot;role&quot;: &quot;user&quot;, &quot;content&quot;: prompt}<br>        ]<br>    )<br>    ai_output = response.choices[0].message.content<br>    logger.debug(f&quot;AI response: {ai_output[:500]}&quot;)<br>    # Save AI output to markdown file with timestamp<br>    md_filename = f&quot;ai_response_{datetime.now().strftime(&#39;%Y%m%d_%H%M%S&#39;)}.md&quot;<br>    with open(md_filename, &quot;w&quot;, encoding=&quot;utf-8&quot;) as md_file:<br>        md_file.write(ai_output)<br>    print(ai_output)<br>if __name__ == &quot;__main__&quot;:<br>    main()</pre><p>When you run this script, the AI will output a market analysis, sentiment breakdown, prediction, score, and action plan in Markdown format.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*HbidJRodgbziaM5Y.png" /></figure><p>The full source code is available in our <a href="https://github.com/serpapi/tutorials/tree/master/python_projects/business-success-predictor">GitHub repository</a></p><h3>What’s next?</h3><p>While this prototype focuses on coffee shops, the same 
approach could easily apply to other businesses as well, such as:</p><ul><li>Restaurants</li><li>Gyms</li><li>Salons</li><li>Co-working spaces</li></ul><p>Some possible improvements that you can make:</p><ul><li>Add charts &amp; visualizations (e.g., competitor density heatmaps)</li><li>Deploy as a Flask/FastAPI web app so anyone can test it in their city.</li><li>Turn it into a dashboard tool for entrepreneurs scouting opportunities.</li></ul><h3>Conclusion</h3><p>By combining real-time Google Maps data with AI analysis, we can quickly gain insights into competition, customer sentiment, and market gaps.</p><p>Of course, no AI can perfectly predict business success. However, tools like this can provide entrepreneurs with a faster and more data-driven way to explore opportunities.</p><p>⚠️ <strong>Disclaimer</strong>: This project is for educational and experimental purposes only. Predictions may be inaccurate. Always validate insights with proper market research before making real business decisions.</p><p>Note:</p><p>If you’d like to explore the <strong>Google Maps API</strong> and <strong>Google Reviews API</strong> for other use cases, be sure to check out our in-depth tutorials below:<br><a href="https://serpapi.com/blog/how-to-make-a-travel-guide-using-serp-data-and-python/">How To Make a Travel Guide Using SERP data and Python</a><br><a href="https://serpapi.com/blog/gauge-business-popularities-using-google-maps/">Gauge Business Popularities using Google Maps</a></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=77be913dd2ee" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[How I organize my ZOHO email label/tags automatically using make.com and chatGPT]]></title>
            <link>https://medium.com/@aina.learns22/how-i-organize-my-zoho-email-label-tags-automatically-using-make-com-and-chatgpt-535e18ea1a30?source=rss-79af36e6978c------2</link>
            <guid isPermaLink="false">https://medium.com/p/535e18ea1a30</guid>
            <category><![CDATA[no-code]]></category>
            <category><![CDATA[ai-automation]]></category>
            <category><![CDATA[automation]]></category>
            <category><![CDATA[make]]></category>
            <dc:creator><![CDATA[Noraina Nordin]]></dc:creator>
            <pubDate>Tue, 13 Aug 2024 17:46:47 GMT</pubDate>
            <atom:updated>2024-09-02T15:35:24.067Z</atom:updated>
<content:encoded><![CDATA[<p>Are you drowning in a sea of emails? Does your inbox make you anxious, causing you to procrastinate on replies? If you’re nodding your head, you’re not alone. The good news is, there’s a solution that can transform your email management strategy and boost your productivity.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*DLq6jAkwRb9Jl4Gi" /><figcaption>Photo by <a href="https://unsplash.com/@teamnocoloco?utm_source=medium&amp;utm_medium=referral">Team Nocoloco</a> on <a href="https://unsplash.com?utm_source=medium&amp;utm_medium=referral">Unsplash</a></figcaption></figure><h4>Before you get started, you need to have:</h4><ul><li>ZOHO email ‘Client ID’, ‘Client Secret’, ‘accountId’ and ‘labelId’ (read here if you’re not sure how to get them)</li><li>make.com account (<a href="https://www.make.com/en/register?pc=noraina">sign up here</a> if you don’t have an account)</li><li>OpenAI API keys</li><li>Download the make.com <a href="https://shop.beacons.ai/norainanordin/cb6cf002-bf49-40bf-a881-b6a095e851b1">blueprint here</a>.</li></ul><h3>My Automated Email Organization System</h3><p>I’m excited to share with you my step-by-step guide on how I organize my Zoho email labels automatically using Make.com and ChatGPT. This system has revolutionized my email management, and I believe it can do the same for you.</p><p>This approach was inspired by Andy O’Neil, who created a <a href="https://youtu.be/2qSUP92kGTg?si=ZzJ8q_nryL7WtTzZ">YouTube tutorial</a> on how to automatically label incoming Google Workspace emails using ChatGPT and Make.com.</p><p>Since I use Zoho email, the steps are slightly different.</p><h3>Step-by-Step Guide</h3><h4><strong>1. Connect Your Email to the Zoho Mail Module</strong></h4><p>Click on the ‘Add’ button and choose your region. 
Make sure to toggle ‘Show advanced settings’ to enter your Client ID and Client Secret keys.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*lDpYnR14-w1Tkpy7unfryw.png" /></figure><h4>2. Set the Interval</h4><p>Click on the ‘Clock’ icon to set the interval for checking your emails.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/541/1*LPTU0gfwAIKAGbGhCZoBxQ.png" /></figure><h4>3. Connect to the ChatGPT Module</h4><p>Click on the ‘ChatGPT’ module and create a connection. Enter your OpenAI API key here.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*QSk7ijlyYTlGqqOrGo-Ipg.png" /></figure><h4>4. Modify the Message Content</h4><p>After creating the connection, ensure you modify the ‘Message Content’ section according to your needs.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/608/1*Q5sQi8LKHjdUpged6fqeFg.png" /></figure><h4>5. Configure the Tools Module</h4><p>Inside the ‘Tools’ module, change the pattern to the label name you created, and set the output as the ‘labelId’. Click on ‘Add item’ to include more labels.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/577/1*p2S6lY1To1Voo2NdOsSpNg.png" /></figure><h4>6. Finalize the Zoho Mail Module</h4><p>In the ‘Zoho Mail’ module, choose the connection you made earlier and replace your ‘accountId’.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/562/1*Usm301-qCiIAHSrOxm40QA.png" /></figure><p>That’s it!</p><p>By following these steps, you’ll create an automated system that:</p><ul><li>Categorizes incoming emails</li><li>Filters out spam</li><li>Identifies urgent messages</li><li>Prioritizes emails that need responses</li></ul><p>The result? A cleaner inbox, reduced anxiety, and more time to focus on what truly matters in your business.</p><h3>Need Help Getting Started?</h3><p>If you’re feeling overwhelmed or unsure about implementing this system, don’t worry! I’m here to help. 
Book a free consultation call with me, and we can discuss how to tailor this solution to your specific needs.</p><h3>Beyond Email Management: How I Can Help Your Business Thrive</h3><p>While mastering your inbox is a great start, it’s just one piece of the productivity puzzle. Here are some other services I offer to help businesses like yours reach their full potential:</p><ol><li><strong>Custom AI Chatbot</strong>: Enhance customer interactions with intelligent chatbots tailored to your business needs.</li><li><strong>Make.com Automation</strong>: Streamline your workflows and boost productivity with custom Make.com automations.</li><li><strong>Virtual Assistant</strong>: Get comprehensive support for your business operations with my virtual assistant services.</li></ol><p>Ready to take control of your inbox and supercharge your productivity? Start with the steps above, and remember, I’m just a call away if you need personalized assistance. Visit <a href="https://norainanordin.com/">norainanordin.com</a> for more information.</p><p>Let’s turn your email chaos into email clarity, together!</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Mastering Financial Data Scraping from TradingView with Python: A Step-by-Step Guide]]></title>
            <link>https://medium.com/@aina.learns22/mastering-financial-data-scraping-from-tradingview-with-python-a-step-by-step-guide-e9879b932cc2?source=rss-79af36e6978c------2</link>
            <guid isPermaLink="false">https://medium.com/p/e9879b932cc2</guid>
            <dc:creator><![CDATA[Noraina Nordin]]></dc:creator>
            <pubDate>Tue, 16 Apr 2024 06:07:42 GMT</pubDate>
            <atom:updated>2024-04-17T15:36:59.882Z</atom:updated>
            <content:encoded><![CDATA[<p>Do you spend hours scouring the web for the latest financial data to inform your investment decisions? What if there was a way to automate this tedious process and gain a competitive edge? Enter web scraping with Python!</p><p>In this step-by-step guide, I’ll show you how to extract up-to-the-minute stock data from TradingView (<a href="https://www.tradingview.com/screener/">https://www.tradingview.com/screener/</a>), a popular financial analysis platform, using nothing but Python code. Buckle up and get ready to take your data game to new heights!</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*5ShYo-lM1iHey7wzbh0Fqg.png" /><figcaption>The <a href="https://www.tradingview.com/screener/">https://www.tradingview.com/screener/</a> page</figcaption></figure><h4>Inspecting the Website</h4><p>The first step in any web scraping project is to identify the data source. For TradingView, we’ll navigate to their stock screener page at <a href="https://www.tradingview.com/screener/">https://www.tradingview.com/screener/</a>.</p><p>Right-click anywhere on the page and select “<strong>Inspect</strong>” (or press F12 on most browsers). This will open the developer tools, allowing us to peek under the hood.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/396/1*prt3seJd5ZGGhNSPzsDx0w.png" /><figcaption>Right-click &gt; Inspect</figcaption></figure><p>Next, head to the “<strong>Network</strong>” tab and refresh the page. 
This will display all the requests made by the website as it loads.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/714/1*QGtkEqrb3rHUu-z4SQvMbQ.png" /><figcaption>Network tab after reloading the page</figcaption></figure><h4><strong>Finding the API Endpoint</strong></h4><p>Scroll through the network requests until you spot something promising; in our case, it is the first one.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/873/1*5Nm3AMBaVKuhxw2KYs2LsA.png" /></figure><p>If we expand the 0: Object entry, we can see all the information needed for the columns.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/528/1*i8C2KE2bClp5sG5pFoFjCQ.png" /></figure><p><strong>Right-click</strong> on this request and select “<strong>Copy as cURL</strong>.”</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/500/1*KnQkDCa2E0CoK2iNeLmAKQ.png" /></figure><p>Now, fire up a tool like Insomnia or Postman, paste the cURL command, and hit send. You should see the same stock data appear in the response!</p><h4><strong>Generating the Python Code</strong></h4><p>Most API clients provide a handy “<strong>Generate Client Code</strong>” feature. In Insomnia, click on the <strong>dropdown</strong> next to the send button and select “<strong>Generate Client Code</strong>”.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/957/1*OwlV5PuByq9PJ2fOmVjqzQ.png" /></figure><p>Copy the generated code into a new Python file in your editor of choice. 
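</p><p>As a rough illustration, here is a hedged sketch of what such generated client code often looks like. The endpoint URL and column names below are assumptions for illustration only; use the exact values captured in your own “Copy as cURL” step:</p>

```python
import requests

# NOTE: endpoint and column names are illustrative assumptions; replace them
# with the ones from the cURL request you copied from the Network tab.
SCANNER_URL = "https://scanner.tradingview.com/america/scan"

def build_payload(total_rows=15000):
    # The "range" field controls how many rows the screener returns.
    return {
        "columns": ["name", "close", "change", "volume", "market_cap_basic"],
        "range": [0, total_rows],
    }

def fetch_screener(total_rows=15000):
    # POST the payload and return the list of row objects under "data".
    response = requests.post(SCANNER_URL, json=build_payload(total_rows), timeout=30)
    response.raise_for_status()
    return response.json()["data"]
```

<p>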
We’ll modify this code to fetch all available stock data from TradingView.</p><p>Open your code editor and paste the code.</p><p>After pasting the code, find the “range” key inside the payload (using Ctrl + F) and change it to 15000, since there are 14217 rows of data in total.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/583/1*lOQ70FUWZk2SJtExQFcp1w.png" /></figure><h4>Parse the response</h4><pre>data_json = response.json()<br>lists = data_json[&quot;data&quot;]</pre><p>Now we will extract all the data and store them in a list.</p><pre>data_list = []<br>for item in lists:<br>    name = item[&#39;d&#39;][0]<br>    price = item[&#39;d&#39;][6]<br>    change = item[&#39;d&#39;][12]<br>    volume = item[&#39;d&#39;][13]<br>    relative_volume = item[&#39;d&#39;][14]<br>    market_cap = item[&#39;d&#39;][15]<br>    pe = item[&#39;d&#39;][17]<br>    eps_diluted = item[&#39;d&#39;][18]<br>    eps_diluted_growth = item[&#39;d&#39;][19]<br>    dividends_yield = item[&#39;d&#39;][20]<br>    sector = item[&#39;d&#39;][22]<br><br>    data = {<br>        &#39;name&#39;: name,<br>        &#39;price&#39;: price,<br>        &#39;change&#39;: change,<br>        &#39;volume&#39;: volume,<br>        &#39;relative_volume&#39;: relative_volume,<br>        &#39;market_cap&#39;: market_cap,<br>        &#39;pe&#39;: pe,<br>        &#39;eps_diluted&#39;: eps_diluted,<br>        &#39;eps_diluted_growth&#39;: eps_diluted_growth,<br>        &#39;dividends_yield&#39;: dividends_yield,<br>        &#39;sector&#39;: sector<br>    }<br><br>    data_list.append(data)</pre><p>This code converts the JSON response into a Python dictionary, then loops through the data and extracts the fields you’re interested in (e.g., stock name, price, volume, etc.).</p><h4>Save the data to a CSV file</h4><p>Finally, let’s save our freshly scraped data to a CSV file for easy analysis (the csv module is part of Python’s standard library):</p><pre>import csv<br><br>with open(&#39;stocks_data.csv&#39;, &#39;w&#39;, newline=&#39;&#39;) as csvfile:<br>    fieldnames = [&#39;name&#39;, 
&#39;price&#39;, &#39;change&#39;, &#39;volume&#39;, &#39;relative_volume&#39;, &#39;market_cap&#39;,<br>                  &#39;pe&#39;, &#39;eps_diluted&#39;, &#39;eps_diluted_growth&#39;, &#39;dividends_yield&#39;, &#39;sector&#39;]<br>    writer = csv.DictWriter(csvfile, fieldnames=fieldnames)<br><br>    # Write the header row<br>    writer.writeheader()<br><br>    # Write the data rows<br>    for data in data_list:<br>        writer.writerow(data)</pre><p>And there you have it! You’ve successfully scraped financial data from TradingView using Python.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/972/1*cgUbkvzeqnkTcloRILd2HQ.png" /><figcaption>Snippet from the CSV file</figcaption></figure><p>Of course, this is just the beginning. You can further analyze and visualize the data using Python’s powerful data science libraries like pandas, matplotlib, and seaborn.</p><p>But remember, with great power comes great responsibility. Always ensure you’re scraping ethically and within the website’s terms of service.</p><p>Happy scraping!</p><p><em>Check out the full code and more examples on my GitHub: </em><a href="https://github.com/ainacodes/TradingView-scraper"><em>https://github.com/ainacodes/TradingView-scraper</em></a></p><p><em>PS: Need help with your data projects? Feel free to reach out to me at </em><strong><em>aina.workspace@gmail.com</em></strong><em> to discuss your project requirements and how I can assist you. Let’s turn your data into actionable insights together!</em></p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Leveraging JavaScript Constructors by Building a Product Adder Feature for an E-commerce Website]]></title>
            <link>https://medium.com/@aina.learns22/leveraging-javascript-constructors-by-build-a-product-adder-feature-for-e-commerce-website-98e8120113f6?source=rss-79af36e6978c------2</link>
            <guid isPermaLink="false">https://medium.com/p/98e8120113f6</guid>
            <category><![CDATA[javascript]]></category>
            <category><![CDATA[javascript-constructor]]></category>
            <dc:creator><![CDATA[Noraina Nordin]]></dc:creator>
            <pubDate>Sun, 26 Nov 2023 16:00:59 GMT</pubDate>
            <atom:updated>2023-11-26T16:00:59.537Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*kB2khvfeBBO_iMOpU2K2Tg.png" /><figcaption>Image generated using <a href="https://playgroundai.com/">PlaygroundAI</a></figcaption></figure><p>As a self-taught web-developer-to-be currently learning JavaScript, I am always looking for ways to make sure I really understand the topic I am studying.</p><p>So in this post, I’ll show you how I used a JavaScript constructor to build a product adder feature for an e-commerce site. By the end, you’ll see how constructors helped streamline working with product data.</p><p>JavaScript constructors are a great way to organize code and manage data more efficiently. They act as blueprints for creating objects with a consistent structure.</p><h4><strong>Understanding JavaScript Constructors</strong></h4><p>Constructors are functions that create new objects. Here’s a simple Product constructor:</p><pre>function Product(title, price, description) {<br>  this.title = title;<br>  this.price = price;  <br>  this.description = description;<br>}</pre><p>Now I can create structured product objects like:</p><pre>const product1 = new Product(<br>  &quot;Blue socks&quot;, <br>  10.99,<br>  &quot;Comfortable cotton socks&quot;<br>);</pre><p>Much easier than manually assigning properties!</p><p>Now, let’s see how we can use this in a web application.</p><h4><strong>Building the Interface</strong></h4><p>I created a simple interface to add and display products. 
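</p><p>As a quick aside before we look at the interface: the same Product blueprint can also be written with ES2015 class syntax, which behaves the same way for our purposes. This variant is a sketch only and is not used in the rest of the tutorial:</p>

```javascript
// ES2015 class equivalent of the Product constructor function above.
class Product {
  constructor(title, price, description) {
    this.title = title;
    this.price = price;
    this.description = description;
  }
}

// Instances are created the same way as with the constructor function.
const product1 = new Product("Blue socks", 10.99, "Comfortable cotton socks");
```

<p>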
The final product looks like this:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/951/1*j41jiHK4EAw9bYIKXnuJ3Q.png" /><figcaption>The web page</figcaption></figure><pre>&lt;!-- HTML --&gt;<br>&lt;h1&gt;Add Your Products Here&lt;/h1&gt;<br>&lt;button class=&quot;addProductBtn&quot; id=&quot;addProductBtn&quot;&gt;Add Product&lt;/button&gt;<br>&lt;div class=&quot;productForm&quot; id=&quot;productForm&quot; style=&quot;display: none;&quot;&gt;<br>  &lt;input type=&quot;text&quot; id=&quot;title&quot; placeholder=&quot;Title&quot;&gt;<br>  &lt;input type=&quot;text&quot; id=&quot;price&quot; placeholder=&quot;Price&quot;&gt;<br>  &lt;input type=&quot;text&quot; id=&quot;description&quot; placeholder=&quot;Description&quot;&gt;<br>  &lt;div class=&quot;buttonContainer&quot;&gt;<br>    &lt;button class=&quot;saveBtn&quot; id=&quot;saveBtn&quot;&gt;Save&lt;/button&gt;<br>    &lt;button class=&quot;cancelBtn&quot; id=&quot;cancelBtn&quot;&gt;Cancel&lt;/button&gt;<br>  &lt;/div&gt;<br>&lt;/div&gt;<br>&lt;div class=&quot;productList&quot; id=&quot;productList&quot;&gt;&lt;/div&gt;</pre><p>Upon clicking “Add Product”, a form dynamically appears, allowing users to input details like the product title, price, and description.</p><h4>Interacting with JavaScript Constructors</h4><p>Through JavaScript, we’ll capture user input and instantiate new Product objects using the constructor.</p><pre>function addProduct() {<br>  // Get entered values<br>  const title = document.getElementById(&#39;title&#39;).value; <br>  const price = document.getElementById(&#39;price&#39;).value;<br>  const desc = document.getElementById(&#39;description&#39;).value;<br><br>  // Create Product<br>  const product = new Product(title, price, desc);<br>  <br>  // Display<br>  displayProduct(product);<br>}</pre><blockquote>The key is using the constructor to convert user input into tidy Product instances!</blockquote><p>displayProduct() then shows the new product. 
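</p><p>A small caveat before moving on: an input’s .value is always a string, so price is stored as text in the snippet above. That is fine for display, but if you ever need numeric operations (totals, sorting), it helps to convert and validate first. Here is a hedged sketch of such a helper; toProduct is an assumed name, not part of the original code:</p>

```javascript
function Product(title, price, description) {
  this.title = title;
  this.price = price;
  this.description = description;
}

// Assumed helper (illustration only): converts the raw price string to a
// number, rejects values that do not parse, then builds a Product.
function toProduct(rawTitle, rawPrice, rawDescription) {
  const price = Number.parseFloat(rawPrice);
  if (Number.isNaN(price)) {
    throw new Error('Invalid price: "' + rawPrice + '"');
  }
  return new Product(rawTitle.trim(), price, rawDescription.trim());
}
```

<p>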
I append some DOM elements:</p><pre>function displayProduct(product) {<br>  const productList = document.getElementById(&#39;productList&#39;);<br>  const productCard = document.createElement(&#39;div&#39;);<br>  productCard.classList.add(&#39;productCard&#39;);<br><br>  // Creating elements for product details<br>  const title = document.createElement(&#39;h2&#39;);<br>  title.textContent = product.title;<br>  const price = document.createElement(&#39;p&#39;);<br>  price.textContent = `Price: $${product.price}`;<br>  const description = document.createElement(&#39;p&#39;);<br>  description.textContent = product.description;<br><br>  productCard.appendChild(title);<br>  productCard.appendChild(price);<br>  productCard.appendChild(description);<br><br>  productList.appendChild(productCard);<br>}</pre><p>For a friendlier user experience, I added resetForm() to clear the form when the user wants to add another product.</p><pre>function resetForm() {<br>  document.getElementById(&#39;title&#39;).value = &#39;&#39;;<br>  document.getElementById(&#39;price&#39;).value = &#39;&#39;;<br>  document.getElementById(&#39;description&#39;).value = &#39;&#39;;<br>}</pre><p>Then I add event listeners for the Add Product, Save, and Cancel buttons.</p><pre>document.getElementById(&#39;addProductBtn&#39;).addEventListener(&#39;click&#39;, () =&gt; {<br>  document.getElementById(&#39;productForm&#39;).style.display = &#39;block&#39;;<br>  resetForm();<br>});<br><br>document.getElementById(&#39;saveBtn&#39;).addEventListener(&#39;click&#39;, () =&gt; {<br>  addProduct();<br>  document.getElementById(&#39;productForm&#39;).style.display = &#39;none&#39;;<br>});<br><br>document.getElementById(&#39;cancelBtn&#39;).addEventListener(&#39;click&#39;, () =&gt; {<br>  document.getElementById(&#39;productForm&#39;).style.display = &#39;none&#39;;<br>});</pre><p>The <strong>final JavaScript code</strong> looks like this:</p><pre>function Product(title, price, description) {<br>  this.title = title;<br>  this.price = 
price;  <br>  this.description = description;<br>}<br><br>function addProduct() {<br>  // Get entered values<br>  const title = document.getElementById(&#39;title&#39;).value; <br>  const price = document.getElementById(&#39;price&#39;).value;<br>  const desc = document.getElementById(&#39;description&#39;).value;<br><br>  // Create Product<br>  const product = new Product(title, price, desc);<br>  <br>  // Display<br>  displayProduct(product);<br>}<br><br>function displayProduct(product) {<br>  // Creating a new div element inside productList<br>  const productList = document.getElementById(&#39;productList&#39;);<br>  const productCard = document.createElement(&#39;div&#39;);<br>  productCard.classList.add(&#39;productCard&#39;);<br><br>  // Creating elements for product details<br>  const title = document.createElement(&#39;h2&#39;);<br>  title.textContent = product.title;<br>  const price = document.createElement(&#39;p&#39;);<br>  price.textContent = `Price: $${product.price}`;<br>  const description = document.createElement(&#39;p&#39;);<br>  description.textContent = product.description;<br><br>  productCard.appendChild(title);<br>  productCard.appendChild(price);<br>  productCard.appendChild(description);<br><br>  productList.appendChild(productCard);<br>}<br><br>function resetForm() {<br>  document.getElementById(&#39;title&#39;).value = &#39;&#39;;<br>  document.getElementById(&#39;price&#39;).value = &#39;&#39;;<br>  document.getElementById(&#39;description&#39;).value = &#39;&#39;;<br>}<br><br>document.getElementById(&#39;addProductBtn&#39;).addEventListener(&#39;click&#39;, () =&gt; {<br>  document.getElementById(&#39;productForm&#39;).style.display = &#39;block&#39;;<br>  resetForm();<br>});<br><br>document.getElementById(&#39;saveBtn&#39;).addEventListener(&#39;click&#39;, () =&gt; {<br>  addProduct();<br>  document.getElementById(&#39;productForm&#39;).style.display = 
&#39;none&#39;;<br>});<br><br>document.getElementById(&#39;cancelBtn&#39;).addEventListener(&#39;click&#39;, () =&gt; {<br>  document.getElementById(&#39;productForm&#39;).style.display = &#39;none&#39;;<br>});</pre><h4>Bringing It Together</h4><p>Combining the DOM with constructors made managing products a breeze. We can now instantly transform user submissions into structured data — no messy object creation code!</p><blockquote>Constructors FTW!</blockquote><h4>Conclusion</h4><p>Constructors help organize complex systems. By standardizing object creation, they reduce redundancies and make code more adaptable.</p><p>I hope you’ve seen their potential for managing data in meaningful ways. Constructors elevate your code from messy scripts to robust programs ready for real-world systems like e-commerce.</p><p>Try them in your next web project!</p><p>Full code available in my GitHub: <a href="https://github.com/ainacodes/product-adder-page">https://github.com/ainacodes/product-adder-page</a></p>]]></content:encoded>
        </item>
    </channel>
</rss>