<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Stories by Tactical Tech on Medium]]></title>
        <description><![CDATA[Stories by Tactical Tech on Medium]]></description>
        <link>https://medium.com/@Info_Activism?source=rss-f41206e28794------2</link>
        <image>
            <url>https://cdn-images-1.medium.com/fit/c/150/150/2*96m6msegJJk8jy5IIhSG6A.jpeg</url>
            <title>Stories by Tactical Tech on Medium</title>
            <link>https://medium.com/@Info_Activism?source=rss-f41206e28794------2</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Thu, 09 Apr 2026 07:12:08 GMT</lastBuildDate>
        <atom:link href="https://medium.com/@Info_Activism/feed" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[The Persistent Problems Of Digital Resilience]]></title>
            <link>https://medium.com/@Info_Activism/the-persistent-problems-of-digital-resilience-be5a6204f579?source=rss-f41206e28794------2</link>
            <guid isPermaLink="false">https://medium.com/p/be5a6204f579</guid>
            <category><![CDATA[tactical-tech]]></category>
            <category><![CDATA[digital-resilience]]></category>
            <category><![CDATA[technology]]></category>
            <category><![CDATA[ai]]></category>
            <category><![CDATA[tech]]></category>
            <dc:creator><![CDATA[Tactical Tech]]></dc:creator>
            <pubDate>Mon, 14 Apr 2025 09:52:49 GMT</pubDate>
            <atom:updated>2025-04-14T09:52:49.043Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*7Ie2BfJbhb-i8xnm.jpg" /></figure><p><a href="https://tacticaltech.org/team-and-board/marek-tuszynski/"><strong><em>Marek Tuszynski</em></strong></a><strong><em>, Executive Director and co-founder of Tactical Tech. January 2025</em></strong></p><p>In this article, I’ll share some of the key lessons we’ve learned about navigating the complex world of digital security. I’ll look at how to identify the right tools, services, resources, and organisations to protect your community, network, or organisation from cyber threats — and why this work is more important than ever. Consider this: almost everything we do online relies on the infrastructure and services of the ‘big five’ technology companies — Google, Apple, Facebook, Amazon, and Microsoft (GAFAM) — with Chinese counterparts such as TikTok and DeepSeek rapidly catching up. At the same time, the regulations and policies that govern these digital spaces and their gatekeepers can be overturned overnight by shifting political agendas with the stroke of a pen, while what civil society has at its disposal is no match for the sophistication of surveillance and hacking tools. These are precarious and difficult times, and understanding how to protect against these risks has never been more urgent.</p><h3>Choosing tools wisely</h3><p>We are often asked to recommend specific tools such as virtual private networks (VPNs), secure messaging applications, and secure web browsers. However, we strongly advise against relying on tools alone when a more holistic approach is required. While tools can provide basic protection, they are not a silver bullet, and using them well requires significant changes to your organisation’s processes and culture. 
And these are just the tools — any tool is only as good as the mastery of its use.</p><h3>The limitations of fixing what we have</h3><p>A common misconception is that addressing digital security is simply a matter of deploying new security tools or technologies. However, this approach often overlooks the underlying structural limitations that reduce an organisation’s ability to address long-term systemic issues. In reality, many organisations lack the necessary resources, including long-term funding and strategic partnerships, to address these deeper challenges. This is because addressing complex security issues typically requires significant investment in people, processes, and infrastructure, which can be difficult to secure within existing funding programmes and budgets.</p><h3>Everyone needs to be involved</h3><p>Effective digital security requires a collaborative effort across your organisation, network, or movement. It’s essential that everyone is involved in identifying potential threats and vulnerabilities, and in developing and adopting strategies to mitigate them. This means that no one person or team should be the sole point of contact for all security issues. This is particularly important if your organisation or network works across multiple projects, regions, or themes, each with its own specific contexts, activities, staff, audiences, etc.</p><h3>Your weaknesses determine your strengths</h3><p>Our experience also highlights the importance of understanding your organisation’s strengths and weaknesses. Just as a chain is only as strong as its weakest link, our digital security efforts will only be effective if we understand how our own vulnerabilities can be exploited by attackers. Think of it like a person-overboard (MOB) drill on a boat. 
It’s usually practised in nice summer conditions with a floating fender as a mock casualty — but it’s very different when you’re in a storm: someone is knocked overboard by the swinging boom, someone else in your crew has to radio for help but doesn’t know how, and the engine just won’t start. You may have been the most experienced skipper on the boat, but the situation got out of control in a split second because no one else was properly prepared, even if they knew the theory reasonably well.</p><h3>Risk assessment is critical but difficult to achieve</h3><p>Risk assessment is an essential aspect of digital security. While it’s tempting to focus on identifying risks rather than mitigating them, this approach often leads to a one-size-fits-all solution that may not address your organisation’s specific vulnerabilities. A good risk assessment takes time, effort, and resources, but it’s ultimately a living document that helps you understand your vulnerabilities and develop effective mitigation strategies. It’s also vital that this assessment is developed with the attitude that incidents will happen, usually when you’re least prepared for them, rather than treating them as a question of if.</p><h3>Soft spots are not where you think they might be</h3><p>We see a lot of focus on the, let’s call them, frontline activities and needs — the tools used by investigative journalists, human rights defenders, environmental activists and so on. Rightly so, but many everyday operations, communications and institutional relationships rely on problematic tools for accounting, travel planning, auditing and funder reporting. These systems collect, store and process a lot of valuable information and assets that are often overlooked in security planning and risk assessments. 
In addition, at the end of the day it all comes down to people: we live and work under a lot of pressure and stress, and the technology we use is, and will remain, confusing, non-intuitive, frustrating and easy to make mistakes with.</p><h3>It used to feel like a game of cat and mouse, now it could just be a game of trap</h3><p>You might be an individual such as the <a href="https://www.theguardian.com/world/2022/may/02/spain-prime-minister-pedro-sanchez-phone-pegasus-spyware">Prime Minister of Spain, Pedro Sánchez</a>, or <a href="https://www.amnesty.org/en/latest/news/2024/12/serbia-authorities-using-spyware-and-cellebrite-forensic-extraction-tools-to-hack-journalists-and-activists/">Slaviša Milanov</a>, an investigative journalist from Serbia; an organisation such as <a href="https://www.hrw.org/news/2022/01/26/human-rights-watch-among-pegasus-spyware-targets">Human Rights Watch</a>; or one of the <a href="https://www.bbc.com/news/world-57891506">tens of thousands of people</a> whose phone numbers are believed to have been targeted by customers of NSO Group or Cellebrite DI Ltd, companies whose tools, Pegasus and Cellebrite, are well-known and well-documented examples of the sophisticated capabilities available to state-level agencies. Such attacks often go undetected; you might only learn post factum <a href="https://techcrunch.com/2021/07/19/toolkit-nso-pegasus-iphone-android/">if your phone was targeted at all</a>. It is almost impossible to protect yourself from such tools, short of going completely dark, but who can? Now add some AI to the equation… and goodbye!</p><h3>Conclusion</h3><p>In summary, effective digital security is not just about tools and technologies; it requires a holistic approach that includes cultural change, collaborative efforts, ongoing risk assessments, and awareness of hidden vulnerabilities in operational processes. 
By addressing these issues, non-profits, networks and movements can build more resilient systems that are better equipped to deal with unexpected challenges and threats. Before concluding, the following sections offer some suggestions and pointers on how to actually choose tools if you must, where to start, and where to look for advice and help.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*MRgGcEaTwZ9AkKhf.jpg" /></figure><h3>How we choose tools</h3><p>At Tactical Tech, we advocate for a rethinking of technology — from short-term convenience to long-term resilience, with trust at its core. Proprietary software (black boxes) demands blind trust, leaving users in the dark about data and security. In contrast, open source software (open boxes) promotes higher levels of transparency, empowering users and giving them control over their digital environments.</p><p>The tools we use and promote are guided by eight key principles:</p><ul><li>Open Source: Transparent and non-proprietary. This does not mean that you can take open source at face value — it is better that the code is open source and has a good open licence, but that does not automatically make it perfect code. Also, check the claims that tool vendors often make: it may be that the client application is open source while the server operations are not — for us, that’s not an open source project. You can read more about this in the 2022 <a href="https://internews.org/wp-content/uploads/2022/05/BASICS-report-on-health-of-open-source-digital-safety-tool-ecosystem.pdf">BASICS Report</a> on the open source digital safety tool ecosystem.</li><li>Trusted: Audited and reliable. Note that audits cost a lot of money and not every developer can afford one; there are tools out there that people trust that have not been audited.</li><li>Mature: Stable, actively supported by users and developers. 
This is a good indicator: look at how often the tool is updated, how the development team responds to problems, and whether the ownership has changed.</li><li>User-Friendly: Accessible to a wide audience. There are a lot of amazing tools out there that require command-line and terminal skills to run; if you can use them, do so, but the rest of us are stuck with tools that have accessible graphical user interfaces (GUIs), and they need to be friendly.</li><li>Multi-Language: Localisation support. This probably goes without saying.</li><li>Multi-Platform: Compatible with Mac, Windows, Linux, and Android. Of course, we want everyone to be on Linux — but let’s be realistic.</li><li>Well-Documented: Easy to understand and use. Documentation is paramount; we rely on it and write our own — so should you.</li><li>Transparent about data practices: Is it clear what happens to data: what is accessed, collected, and stored? Are there third-party trackers? Does the tool require access to services it doesn’t need? What is its data retention policy? 
Is it end-to-end encrypted?</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*HbikU86Nr7bgznEx.jpg" /></figure><h3>What tools we use</h3><p>This is not to say that you should use any of these tools — it is your choice — but it might be a good start to see the differences, set your own priorities and find what works for you.</p><p><strong>Content collaboration and development:</strong></p><ul><li><a href="https://nextcloud.com/">NextCloud</a></li><li><a href="https://about.gitlab.com/">GitLab</a></li></ul><p><strong>Operating Systems:</strong> We prefer Linux in various flavours</p><ul><li><a href="https://www.debian.org/">Debian</a></li><li><a href="https://linuxmint.com/">Mint</a></li><li><a href="https://fedoraproject.org/">Fedora</a></li><li><a href="https://ubuntu.com/desktop">Ubuntu</a></li></ul><p><strong>Internet Browsers:</strong></p><ul><li><a href="https://www.torproject.org/download/">Tor Browser</a></li><li><a href="https://brave.com/">Brave</a></li><li><a href="https://mullvad.net/en">Mullvad Browser</a></li><li><a href="https://www.mozilla.org/en-US/firefox/new/">Firefox</a></li><li><a href="https://github.com/ungoogled-software/ungoogled-chromium">Ungoogled-chromium</a></li></ul><p><strong>Text, Video, Audio Editing:</strong></p><ul><li><a href="https://www.libreoffice.org/">LibreOffice</a></li><li><a href="https://www.shotcut.org/">ShotCut</a></li><li><a href="https://www.audacityteam.org/">Audacity</a></li><li><a href="https://obsproject.com/">OBS Studio</a></li></ul><p><strong>Encrypted Email, Calendar, and Contacts:</strong></p><ul><li><a href="https://www.thunderbird.net/en-GB/">Thunderbird</a></li><li><a href="https://proton.me/mail">Proton mail</a></li><li><a href="https://tuta.com/">Tuta</a></li><li><a href="https://riseup.net/en/about-us">Riseup</a></li></ul><p><strong>Password managers:</strong></p><ul><li><a href="https://keepassxc.org/">KeePassXC</a></li><li><a href="https://proton.me/pass">Proton 
Pass</a></li></ul><p><strong>Instant Messenger:</strong></p><ul><li><a href="https://signal.org/">Signal</a></li><li><a href="https://element.io/">Element</a></li><li><a href="https://wire.com/en/">Wire</a></li></ul><p><strong>Secure File Storage:</strong></p><ul><li><a href="https://www.veracrypt.fr/code/VeraCrypt/">VeraCrypt</a></li><li><a href="https://cryptomator.org/">Cryptomator</a></li></ul><p><strong>Connecting Securely to the Internet (VPNs):</strong></p><ul><li><a href="https://mullvad.net/en">Mullvad VPN</a></li><li><a href="https://protonvpn.com">Proton VPN</a></li><li><a href="https://riseup.net/en/vpn">Riseup VPN (free)</a></li></ul><p><strong>Social Media and Video Sharing:</strong></p><ul><li><a href="https://mastodon.cc">Mastodon</a></li><li><a href="https://bsky.app/">Bluesky</a></li><li><a href="https://joinpeertube.org/">PeerTube</a></li></ul><p><strong>Email Newsletters:</strong></p><ul><li><a href="https://docs.mailman3.org/projects/postorius/en/latest/#">Postorius</a></li><li><a href="https://listmonk.app/">ListMonk</a></li></ul><p><strong>Online forms and surveys:</strong></p><ul><li><a href="https://www.limesurvey.org/">LimeSurvey</a></li></ul><p><strong>Communication and Virtual Meetings:</strong></p><ul><li><a href="https://bigbluebutton.org/">BigBlueButton</a></li></ul><p><strong>Web Analytics:</strong></p><ul><li><a href="https://matomo.org/">Matomo</a></li></ul><p><strong>AI:</strong></p><ul><li><a href="https://github.com/janhq/jan">Jan</a></li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*1616pYTOH1qHzx7o.jpg" /></figure><h3>Resources to check out in alphabetical order (ours is at the bottom)</h3><p>Some are more up to date than others. This is not an exhaustive list, but it is a good place to start. We do not endorse any of these, including our own; it is up to you to decide what would work for you, and the list below should speed up the process.</p><h3><a href="https://www.accessnow.org/help/">Access Now’s 
Digital Security Helpline</a></h3><p>Access Now’s Digital Security Helpline works with individuals and organizations around the world to keep them safe online. If you’re at risk, we can help you improve your digital security practices to keep out of harm’s way. If you’re already under attack, we provide rapid-response emergency assistance.</p><h3><a href="https://securityplanner.consumerreports.org/">Consumer Reports</a></h3><p>Security Planner (initially developed by Citizen Lab) — a self-assessment tool that helps individuals cut down on data collection and protect sensitive personal information, health data, and geolocation.</p><h3><a href="https://www.digitaldefenders.org">Digital Defenders Partnership</a></h3><p>DDP is an international programme that contributes to strengthening the resilience of Human Rights Defenders by increasing their digital security through a holistic and sustainable approach.</p><h3><a href="https://www.holistic-protection.org/">Holistic Protection Collective</a></h3><p>Aims to improve the security of rights-based organisations, initiatives, foundations and activist collectives. 
Having a holistic approach for us means mitigating and responding to digital, physical and emotional risks.</p><h3><a href="https://www.eff.org/">Electronic Frontier Foundation’s Tools and Materials</a></h3><p><a href="https://ssd.eff.org/en/playlist/human-rights-defender">Surveillance Self-Defense</a></p><p><a href="https://ssd.eff.org/en/module/choosing-vpn-thats-right-you">How to choose the right VPN</a></p><h3><a href="https://www.securityeducationcompanion.org/">Education Companion</a></h3><p>A resource for people teaching digital security to their friends and neighbours</p><h3><a href="https://ethical.net/resources/">Ethical Network</a></h3><p>Ethical Alternatives &amp; Resources — Ethical.net is a collaborative platform for curating, building and discovering ethical alternatives for tech products.</p><h3><a href="https://ethical.net/resources/">Free Press Unlimited</a></h3><p><a href="https://totem-project.org/">Totem</a> — Digital Security training for activists and journalists — an online platform that helps journalists and activists use digital security and privacy tools and tactics more effectively in their work.</p><h3><a href="https://www.frontlinedefenders.org/">Frontline Defenders</a></h3><p><a href="https://securityinabox.org/en/">Security in a Box</a> — a toolkit developed together with us — currently led by Frontline Defenders.</p><p><a href="https://www.frontlinedefenders.org/en/resource-publication/guide-secure-group-chat-and-conferencing-tools">How to choose a Secure App for Conferencing</a></p><p><a href="https://www.frontlinedefenders.org/en/programme/risk-analysis-protection-training">Risk Analysis &amp; Protection Planning</a></p><h3>Global Investigative Journalism Network’s ‘Journalist Security Assessment Tool’ and training</h3><p><a href="https://advisory.gijn.org/cybersecurity-assessment/">The Journalist Security Assessment Tool</a> (JSAT) was developed for the use of GIJN partners and journalists around the world. 
Upon completion, the assessment tool will display a series of recommendations.</p><h3>Global Support Link</h3><p><a href="https://guide.globalsupport.link/">Help Desk</a> — provides free resources to people and organizations who are part of the Powered by the People (PxP) global network. In addition to the online resources, members of the Help Desk can answer questions and provide advice on digital and physical security via chat, messaging or email channels.</p><h3><a href="https://guardianproject.info/">Guardian Project</a></h3><p>Guardian Project creates easy-to-use secure apps, open-source software libraries, and customized solutions that can be used around the world by any person looking to protect their communications and personal data from unjust intrusion, interception and monitoring.</p><h3>Internews Digital Safety Initiative / Safetag</h3><p><a href="https://safetag.org/">Safetag</a> is a professional audit framework that adapts traditional penetration testing and risk assessment methodologies to be relevant to smaller non-profit organizations based or operating in the developing world.</p><h3>Level-Up</h3><p><a href="https://ghpages.level-up.cc/">LevelUp</a> is a living project intended to provide support to, and enable creation of resources and sharing of knowledge within, a growing network of individuals providing needed digital safety training and education to users of technology worldwide.</p><h3>Open Briefing</h3><p><a href="https://openbriefing.gitbook.io/defenders-protocol">The Protocol</a> — The Holistic Security Protocol for Human Rights Defenders (the Defender’s Protocol) helps us advance our physical safety, digital security, and well-being and resilience. 
By following the Protocol, we enhance our individual and collective security, and can reduce the burden of attacks, harassment, and censorship on us and our communities.</p><h3>Rapid Response Network &amp; CiviCERT</h3><p><a href="https://digitalfirstaid.org/">The Digital First Aid Kit</a> is a free resource to help rapid responders, digital security trainers, and tech-savvy activists to better protect themselves and the communities they support against the most common types of digital emergencies. It can also be used by activists, human rights defenders, bloggers, journalists or media activists who want to learn more about how they can protect themselves and support others.</p><h3>Reporters without Borders (RSF)</h3><p><a href="https://rsf.org/sites/default/files/guide_journaliste_rsf_2015_en_0.pdf">Safety Guide For Journalists</a> — a handbook for reporters in high-risk environments (PDF)</p><h3>Tactical Tech’s many projects</h3><p><a href="https://holistic-security.org/">Holistic Security</a> is a strategy manual to help human rights defenders maintain their well-being in action. 
The holistic approach integrates self-care, well-being, digital security, and information security into traditional security management practices.</p><p><a href="https://holistic-security.org/trainers-manual.html">Holistic Security Trainers manual</a> is a companion to the Strategy Manual for HRDs, drawing out key learnings from the holistic security project from a facilitator’s perspective and bringing together best practices from the fields of digital security, physical security and psycho-social well-being.</p><p><a href="https://en.gendersec.train.tacticaltech.org/">Gender Tech Resources</a> (note: last updated in 2019) is a resource that introduces a holistic, feminist perspective to privacy and digital security trainings, informed by years of working with women and trans activists around the world.</p><p><a href="https://xyz.informationactivism.org/en/">XYZ</a> is a space for practical tools to navigate digital security and privacy from a gender perspective, learn from each other’s activism, inspire one another and co-create.</p><p>Safety First! 
— a <a href="https://kit.exposingtheinvisible.org/en/safety.html">guide</a> and a related <a href="https://exposingtheinvisible.org/en/workshops/safety-first-workshop/">workshop curriculum</a> to help you stay digitally, physically and psychologically safe and aware of potential risks at all times by adopting some basic good practices and tools to keep your human sources, yourself and your evidence protected.</p><p><a href="https://datadetoxkit.org">Digital Enquirer Kit</a> — an e-learning course about media literacy, verification, and how to navigate the internet safely.</p><p><a href="https://datadetoxkit.org">Data Detox Kit</a> — a set of simple guides about Artificial Intelligence, digital privacy, security, well-being, misinformation, health data, and tech and the environment.</p><p>All images by the author, from the series “Cemetery Surfaces”, 2025</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Everywhere, All the Time: A digital literacy intervention for teens that fosters critical…]]></title>
            <link>https://medium.com/@Info_Activism/everywhere-all-the-time-a-digital-literacy-intervention-for-teens-that-fosters-critical-095bc2ba45d7?source=rss-f41206e28794------2</link>
            <guid isPermaLink="false">https://medium.com/p/095bc2ba45d7</guid>
            <category><![CDATA[critical-digital-literacy]]></category>
            <category><![CDATA[educators]]></category>
            <category><![CDATA[digital-literacy]]></category>
            <category><![CDATA[education-technology]]></category>
            <dc:creator><![CDATA[Tactical Tech]]></dc:creator>
            <pubDate>Tue, 16 Apr 2024 08:21:54 GMT</pubDate>
            <atom:updated>2024-04-16T09:48:22.036Z</atom:updated>
            <content:encoded><![CDATA[<h3>Everywhere, All the Time: A digital literacy intervention for teens that fosters critical conversations about technology and AI</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*71BypJDV0SNzuUU5Ul54xg.jpeg" /></figure><p>With a growing generation of teens relying on the internet for learning, entertainment, and socialising, it’s crucial to cultivate their capacity to ask critical questions about how technology impacts their lives, their communities, and the planet. But how can educators guide teens to understand and navigate the digital world confidently? <a href="https://theglassroom.org/youth/everywhere-all-the-time/"><strong>“Everywhere, All the Time”</strong></a><strong> </strong>is a creative and playful digital literacy intervention that aims to do just that.</p><p>Educators working with young people will find <a href="https://theglassroom.org/youth/everywhere-all-the-time/"><strong>“Everywhere, All the Time”</strong></a><strong> </strong>helpful in fostering critical conversations about technology, AI and their impacts. Developed by Tactical Tech’s youth initiative <strong>What the Future Wants</strong>, this learning experience equips educators, librarians and anyone working with youth aged 13–19 with a set of engaging and innovative digital literacy resources and methods that can be used in both formal and informal learning environments.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1007/0*rblUJpsC8k_U17y0" /><figcaption>Example of the posters included in the “Everywhere, All The Time” digital literacy intervention. Design by La Loma</figcaption></figure><p>Co-developed with 300 young people and 100 educators worldwide, <a href="https://theglassroom.org/youth/everywhere-all-the-time/"><strong>“Everywhere, All the Time”</strong></a><strong> </strong>includes an easy-to-print and install exhibition, activity cards and an extensive educators’ guidebook. 
These resources assist facilitators in guiding teens to explore crucial topics, including understanding how AI chatbots work and the hidden labour behind technology, recognising the attention-grabbing designs behind popular games, and learning about the physical infrastructures that power the internet, among many other topics.</p><p><a href="https://theglassroom.org/youth/everywhere-all-the-time/"><strong>“Everywhere, All the Time”</strong></a><strong> </strong>uses practical examples, simple and familiar language, and engaging visuals, making it an accessible interactive experience that resonates with youth. By utilising these resources, educators can create a unique space where teens can reflect on their relationship to technology while examining how tech and invisible systems like algorithms shape their lives and relationships.</p><p>Teens can also increase their confidence by actively adopting better digital privacy and well-being habits. <a href="https://theglassroom.org/youth/everywhere-all-the-time/"><strong>“Everywhere, All the Time”</strong></a><strong> </strong>serves as a learning platform for meaningful exploration and discussion and a place for inspiration that encourages teens to take action and shape the digital world they want to live in.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/579/0*nvrcfqlK8NJjHdkB" /><figcaption>Cover of the “Everywhere, All The Time” Educators Guidebook. Design by La Loma</figcaption></figure><p><a href="https://theglassroom.org/youth/everywhere-all-the-time/"><strong>“Everywhere, All the Time”</strong></a><strong> </strong>is available in English. 
It will soon be available in Czech, Cypriot Greek, Italian, Polish, Romanian, Slovak, Spanish, Ukrainian and more languages.</p><p>“Everywhere, All the Time” is produced by Tactical Tech in collaboration with <a href="http://www.eun.org/professional-development/academy">European Schoolnet</a>, <a href="https://www.ifla.org/">International Federation of Library Associations (IFLA)</a>, <a href="https://www.savethechildren.it/">Save the Children (Italy) </a>and generously co-funded by the European Union.</p><p><strong>Local and national partners working with youth also supported the production of this intervention: </strong><a href="http://www.verificat.cat/">Associació Verificat — Spain</a>, <a href="https://www.casahacker.org/">Casa Hacker — Brazil</a>, <a href="https://www.kp.sik.si/">Central Library of Srečko Vilhar Koper — Slovenia</a>, <a href="https://www.racunalniski-muzej.si/en/home-english/">Computer History Museum — Slovenia</a>, <a href="https://www.jugendstiftung.de/">German Youth Foundation Baden Württemberg — Germany</a>, <a href="https://www.goethe.de/ins/mk/de/index.html">Goethe Institut Skopje — North Macedonia</a>, <a href="https://laboratorium.ba/">Laboratorium Tuzla — Bosnia and Herzegovina</a>, <a href="https://www.uroboros.design/">Plataforma Uroboros — Czech Republic</a>, <a href="http://knjiznice.nsk.hr/prelog/">Prelog Town Library — Croatia</a>, <a href="https://knihovnatrinec.cz/">Trinec Town Library — Czech Republic</a>, <a href="https://www.webwise.ie/">Webwise — Ireland</a>, <a href="https://twitter.com/flesymz">Yusuf Ganyana — Kenya</a>.</p><p>Find out more at <a href="https://theglassroom.org/youth/everywhere-all-the-time/"><strong>https://theglassroom.org/youth/everywhere-all-the-time/</strong></a></p><p><strong>How do I get access to these resources?</strong> We would be thrilled for you to use this new resource pack for educators. 
If you are interested in getting hold of the materials, please contact <a href="mailto:youth@tacticaltech.org">youth@tacticaltech.org</a>.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Inform: A voter literacy resource center for navigating online political influence and digital…]]></title>
            <link>https://medium.com/@Info_Activism/inform-a-voter-literacy-resource-center-for-navigating-online-political-influence-and-digital-b3412a03dc10?source=rss-f41206e28794------2</link>
            <guid isPermaLink="false">https://medium.com/p/b3412a03dc10</guid>
            <category><![CDATA[elections]]></category>
            <category><![CDATA[democracy]]></category>
            <category><![CDATA[political-influence]]></category>
            <category><![CDATA[voter-education]]></category>
            <dc:creator><![CDATA[Tactical Tech]]></dc:creator>
            <pubDate>Thu, 11 Apr 2024 07:45:09 GMT</pubDate>
            <atom:updated>2024-04-11T08:15:15.961Z</atom:updated>
            <content:encoded><![CDATA[<h3>Inform: A voter literacy resource center for navigating online political influence and digital election campaigns</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*p4n8jjxYI4vQv3D2Rytsng.png" /><figcaption>Illustration by Tactical Tech</figcaption></figure><p>In a world where technology shapes every aspect of our lives, understanding its influence on democratic processes and elections has never been more crucial. This holds particular relevance in a year when over 60 elections are scheduled across the globe, defining the future of millions of people worldwide. But how can voters engage in discussions about the impacts of digital and data-driven technologies on political campaigns and elections?</p><p>Introducing <a href="https://influenceindustry.org/en/inform/"><strong>Inform: A voter literacy resource center</strong></a>, designed to empower voters, educators, facilitators, and organisations involved in voter education to explore the ways digital and data-driven technologies play a key role in political campaigns and elections. 
The <strong>Inform: A voter literacy resource center</strong> offers customisable workshop curricula and activities that can be tailored to the specific audience’s needs to spark conversations, create spaces for debate and develop the skills of facilitators and voters, including:</p><ul><li>The <a href="https://influenceindustry.org/en/inform/resources/voters-guide-workshop/">Voter’s Guide Workshop</a>: a comprehensive plan and workshop curriculum to support participants in learning about the foundations of digital and political influence.</li><li><a href="https://influenceindustry.org/en/inform/resources/voters-guide-workshop/">Workshop Slides</a> that offer visuals and text to the participants.</li><li>Stand-alone <a href="https://influenceindustry.org/en/inform/resources/voters-guide-activities/">activities </a>centred on engaging voters playfully on digital and political influence.</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1000/0*eQlqukMD795n7oEp" /></figure><p>Illustration by Tactical Tech</p><p>The <a href="https://influenceindustry.org/en/inform/"><strong>Inform: A voter literacy resource center</strong></a><strong> </strong>also includes a selection of resources that voters can use, such as our playful online personality test, which demonstrates how targeted ads might work, explanatory videos such as “Your Data, Our Democracy”, and a practical how-to for voters, the Data Detox Kit’s A Voter’s Guide.</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fplayer.vimeo.com%2Fvideo%2F469701415%3Fapp_id%3D122963&amp;dntp=1&amp;display_name=Vimeo&amp;url=https%3A%2F%2Fvimeo.com%2F469701415&amp;image=http%3A%2F%2Fi.vimeocdn.com%2Fvideo%2F977848947-f59afec15f65754b9a6ef9b9d3469d8937b9ec0089945618a1549eb97886810c-d_1280&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=vimeo" width="1920" height="1080" frameborder="0" scrolling="no"><a 
href="https://medium.com/media/b6e6916bee9183302fe2e29afe303358/href">https://medium.com/media/b6e6916bee9183302fe2e29afe303358/href</a></iframe><p>Through the journey, voters who access the resources or who participate in the workshops facilitated using the materials will:</p><p><strong>Stay Informed:</strong></p><ul><li>Knowledge is power! By understanding the tools and methods used in the influence industry and broad data collection, voters will be better prepared to engage with these topics or recognize their own knowledge gaps for further exploration.</li><li>Those interested in delving deeper into these topics will have a starting point of key terms, introductory materials, and resources to enhance their political agency and digital protection.</li></ul><p><strong>Take Action:</strong></p><ul><li>Participants will be able to extend their knowledge into personal and political action.</li><li>Actions can involve community engagement, reflective and thoughtful interaction with online advertisements, or protective security on digital devices.</li></ul><p><strong>Express Voter Opinions:</strong></p><ul><li>Given the evolving nature of these themes, technologies, and power dynamics, technology users should approach discussions in an open, safe, and informed manner.</li><li>Discussing politics may be challenging for some, and these resources offer playful, private, and expressive forms to develop a personal understanding of political systems, which is essential for active citizenship.</li></ul><p>Explore <a href="https://influenceindustry.org/en/inform/"><strong>Inform: A voter literacy resource center</strong></a><strong> now!</strong></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=b3412a03dc10" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Co-Creation as Methodology for Uncertain Times: Practice, Learnings, and Findings]]></title>
            <link>https://medium.com/@Info_Activism/co-creation-as-methodology-for-uncertain-times-practice-learnings-and-findings-aecd3ed5f451?source=rss-f41206e28794------2</link>
            <guid isPermaLink="false">https://medium.com/p/aecd3ed5f451</guid>
            <category><![CDATA[education]]></category>
            <category><![CDATA[youth]]></category>
            <category><![CDATA[co-creating]]></category>
            <category><![CDATA[digital-literacy]]></category>
            <dc:creator><![CDATA[Tactical Tech]]></dc:creator>
            <pubDate>Fri, 02 Dec 2022 11:01:55 GMT</pubDate>
            <atom:updated>2022-12-02T11:01:55.503Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*vE-2wLEntQsE5nqeoUhG5g.jpeg" /></figure><h3>Introduction</h3><p>In this mini-series, we explore how Tactical Tech’s first youth-centered team co-created the public education intervention <a href="https://theglassroom.org/en/what-the-future-wants/">‘What the Future Wants’</a>, which was released in Spring 2022. <a href="https://tacticaltech.org/news/youth-co-creation-context/">In Part 1</a>, we outlined why we chose co-creation as method to conduct the What the Future Wants workshops. In these times characterised by uncertainty, we found curiosity to be the best way to encourage young people to engage with difficult questions. Here, in Part 2, we dive into the workshops we conducted and share principles and practices that informed the co-creation process and findings from the workshops.</p><p>The What the Future Wants co-creation took the form of a workshop that was designed in three parts (explained below). We worked with six partners in the UK, Czech Republic, Ireland, Slovenia, North Macedonia, the Netherlands (and some associated partners in Germany, Turkey, Greece etc.). Approximately 200 young people (between the ages of 13 and 18) participated in around 10 workshops and were selected by our partners, both through existing networks and open calls. Some of the workshops were delivered by Tactical Tech in English, and in collaboration with the partners, and others were delivered independently by the partners, in their local language.</p><p>The workshop was designed in three parts, three hours each (occasionally shortened where necessary by the partner), which expanded in scale and perspective (individual, group, societal, planetary) as the participants progressed. There were always a facilitator, a note-taker, and a tech supporter. 
The three parts were:</p><ul><li><strong>Digital Natives:</strong> There are many assumptions about young people and technology, but a lot of the time these assumptions are made from the outside looking in. This session is about young people’s perspective of technology, creating a space for them to share and reflect on experiences of growing up in a digital world, including the good, the bad, and their advice for future generations.</li><li><strong>Digital Influence:</strong> Doom scrolling, profiling, herding — there are many hidden influencers behind screens that shape what individuals see and sometimes how people act. So what can young people do to influence the influencers?</li><li><strong>Digital Ecosystem:</strong> Technology is not neutral, and it is often intertwined with the society in which it’s built. Looking outwards to the wider web of issues that technology is part of, young people try to understand when technology helps and when it harms, and who gets to make those decisions.</li></ul><p>One of the first challenges was understanding the distinction between the different terms surrounding co-creation: ‘co-design’, ‘co-development’, ‘co-production’, which are often interchangeably used. A concern was that we could not claim co-creation if the created outputs were not directly designed and produced by the participants. However, it soon became clear that co-creation can take many forms and does not always result in co-design (the co-creation <em>and</em> design of a specific output within a community). 
Our approach to co-creation focused on participatory action research, which we then used as the building blocks for creating the content.</p><p>Interlinked with this challenge were the dynamics of power, which are a recurring theme in co-creation analysis: ‘who decides the terms of engagement, what media is made and by whom, and why and who benefits from this type of project.’(1) As the designers of the workshop, and the creators of the resulting exhibition, we had to be very careful not to set the agenda of the space. How can you encourage open and exploratory questions if you do not have a structure or a starting point? We soon found out that a structure is necessary, and within that structure you need activities that act as starting points for thought and exploration.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*TVKQODUp73AOq9hK.png" /><figcaption><em>Visual of “most and least favorite things” as defined by our sample of young people; courtesy of Dominika Knoblochová</em></figcaption></figure><h3>Warm-up: Most and least favorite things about technology</h3><p>As an example of a simple activity used to warm up the sessions, we gave participants space to share with us how they experience technology by stating 1) their <em>favourite thing about technology</em> and 2) their <em>least favourite thing about technology</em>. They could answer both questions or just one. The answers that emerged gave us a surface-level insight into the worlds they were inhabiting. Some of the <em>favourite</em> things were simple, like ‘sharing’, ‘finding inspiration’, ‘communication’, ‘organisation’, ‘(ability to) simplify tasks’. 
Others were more profound, such as: ‘I don’t know where I’d be if I wasn’t able to create digital art and things like that’, <em>‘technology can be a great way for people to get educated, to discover who they really are’</em>, and <em>‘(my favourite thing) community, finding understanding, the possibility of expression’</em>. Their <em>least favourite</em> aspects of technology revealed the darker side of their day-to-day experience and included ‘body shaming’, ‘doom scrolling’, ‘when it stops working’, ‘difficult to distinguish fake from reality’, ‘no accountability’, ‘cyberbullying’ and ‘the most irritating thing for me are the rules that are now in place for the internet. Those are quite old. So yeah, there aren’t really any rules for the internet we have right now.’</p><p>Despite this being a short warm-up activity, we came back to the answers time and again to remind ourselves of where the participants were coming from, how they were defining the issues and what they wanted to protect.</p><h3>Activity: How others see you</h3><p>When we used stereotypes or pre-existing narratives, we used them as a provocation. One activity, the ‘Stereotype Spectrogram’, used common tropes about teens and technology taken from media headlines: ‘Teenagers are addicted to technology’, ‘Technology harms teenagers’ mental health’ and ‘Teenagers are vulnerable to fake news’. This acted as a counter-space, an opportunity for the participants to orient themselves close to or far away from an assumption that the ‘other’ (the media, parents, teachers) had made about them. Using a spectrogram — a scale with ‘agree’ on one end and ‘disagree’ on the other, and everything in between — we asked the participants to place themselves and then explain their position. 
For example, ‘teenagers are addicted to technology’ resulted in a vibrant discussion that showed the nuanced and often complicated reality that was not reflected in the blanket headline:</p><ul><li>‘it depends what you mean when you say “addicted” I don’t spend every single waking moment on my phone and I could survive without it for a while but I would not have fun at all, some people are just dependent on it for dopamine tbh’.</li><li>‘Probably true, not exclusive or necessarily a problem’</li><li>‘It’s a problem for many, however it is slightly misunderstood, as it is an evolution of socializing.’</li><li>‘although teenagers are easily influenced by technology, i think that not just teenagers, but adults can get addicted to technology as well.’</li><li>‘as a teenager myself, a lot of times i find myself looking always for some way to reach technology’</li><li>‘the question is whether it is addiction due to necessity or “need”, sometimes technologies provide an escape from unpleasant reality, which can then cause addiction’</li><li>‘I was out for dinner the other day, and there was a couple out to dinner together. Well, that girl was on her phone, and that boy was watching football. And they were just going out to dinner together on a date. And then I think: yes, where is it going.’</li><li>‘So the younger you are, the more dependent on your phone you are and you can’t think otherwise.’</li><li>‘We teenagers can’t live without a phone. A lot happens on the phone, you have friends over the phone. 
If all your friends are on your phone and you’re not on it, they’ll look at you weird too.’</li></ul><p>A surprising observation was that all the ‘stereotype’ statements resulted in a distribution of responses across the scale; there was almost never a majority agreement or disagreement with a statement, and as the discussion unfolded, some participants shifted their perspective to be more inclusive of somebody else’s experience.</p><p>Similarly, when we introduced the statement ‘technology harms teenagers’ mental health’, there was a broad spectrum of responses, such as:</p><ul><li>‘it’s hard to realize, it’s happening slowly, it’s sneaky’</li><li>‘Technology gives teenagers too much pressure’</li><li>‘For me i have a short attention span because I always look at instagram posts for like 5 seconds and just keep scrolling and so irl i cant stay focused for a long period of time’</li><li>‘I actually think social media educates us about mental health’</li><li>‘most of teenagers feel anxious just because of technology but at same time it can be exit from anxiety’</li><li>‘like I said I don’t know where I’d be without technology as so many things have helped my life for the better’</li><li>‘I personally think it is very harmful. Because not only that teenagers deal with bullying at school, at the internet too. It also has super models on social media and u compare yourself to them which is mentally harmful.’</li></ul><p>A few insights came from these discussions. In particular, it became clear that there was an awareness about the issues that we presented, even if they were over-simplified. 
Also, although many of the answers lacked clear nuance on paper, an implicit nuance emerged in discussion with the young participants, which could then be used to adapt the subsequent activities to a level that challenged the group without pushing them in any direction.</p><h3>Lessons Learned</h3><p>Setting expectations, from both the facilitator and participant perspectives, is really important, and finding a balance between opportunity and capacity is a good starting point. Many opportunities may arise from a co-creation process: perhaps there is an unexpectedly high level of engagement and commitment from the participants, or various networks are created as a result of the collaborations, and this creates the opportunity for further collaboration and engagement.</p><p>At this point it’s important to ask: how much time commitment would that require? And do we have the capacity to carry out this additional collaboration in a sustainable and constructive way? To avoid unsustainable or aimless co-creation collaborations: communicate clearly (and early on) what is expected from the participants; decide on a start date and an end date for the co-creation process, or if there is no clear end date, state the reason for that; and when new opportunities for collaboration present themselves, reflect on whether there is enough capacity and time and, if they go ahead, who is best suited to take them forward. It is very possible that at the point of closure, the participants can continue a form of co-creation or peer-to-peer learning independently.</p><p>Another challenge and learning was the documentation of the workshop. As the workshops were carried out by multiple partners in different formats, iterations and languages, we were not always able to thoroughly document the findings. 
An advantage of doing the workshops online was that it was much easier to capture the conversations and outcomes from group activities because they were centralised in one place, and on one platform. We also had one note-taker.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*oaCzqWeIloFp1xXS.jpeg" /><figcaption><em>What the Future Wants exhibition set-up; photo courtesy of La Loma</em></figcaption></figure><p>With the help of six partners, we ran around 10 of these co-creation workshops over the course of approximately 4 months in 2021, engaging approximately 200 young people (between the ages of 13 and 18). After compiling the results of the workshops, we at Tactical Tech analyzed common threads and particular areas of interest that recurred among the young people in these sessions. As it goes with Tactical Tech projects, the next steps involved several rounds of conceptualisation and iterations of our new resources. Some of the What the Future Wants materials matched with and were adapted from The Glass Room resources, while others were new and required research, development, prototyping and testing. Once the materials were finalized, the localization process followed. In Spring 2022, we launched <a href="https://theglassroom.org/en/what-the-future-wants/"><em>What the Future Wants</em></a> and began the next steps of building additional partnerships to see how the materials could be used in communities around the world.</p><p>- —</p><p><em>Original essay written by Daisy Kidd. Thanks to Dominika Knoblochová for coordination and support during the co-creation process. Reformatting and edits into a mini-series by Safa Ghnaim, Stefanie Felsberger, and Christy Lange. If you are working on similar topics and want to collaborate, we would love to hear from you. 
Please write to the team at youth@tacticaltech.org.</em></p><p><em>- —</em></p><p>Footnotes:</p><ul><li>Introduction and Overview by Katerina Cizek and William Uricchio; May 16, 2019; Works In Progress by MIT Press; <a href="https://tacticaltech.org/news/_ftnref1">https://wip.mitpress.mit.edu/pub/collective-wisdom-exec-sum-biographies/release/3</a></li></ul><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=aecd3ed5f451" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[[A partner’s interview] Soledad Magnone, Director of the JAAKLAC iniciativa]]></title>
            <link>https://medium.com/@Info_Activism/a-partners-interview-soledad-magnone-director-of-the-jaaklac-iniciativa-6a312dc7d25?source=rss-f41206e28794------2</link>
            <guid isPermaLink="false">https://medium.com/p/6a312dc7d25</guid>
            <category><![CDATA[digital-literacy]]></category>
            <category><![CDATA[digital-technology]]></category>
            <category><![CDATA[latinoamerica]]></category>
            <category><![CDATA[youth]]></category>
            <category><![CDATA[education]]></category>
            <dc:creator><![CDATA[Tactical Tech]]></dc:creator>
            <pubDate>Thu, 01 Dec 2022 11:46:20 GMT</pubDate>
            <atom:updated>2022-12-01T11:46:20.960Z</atom:updated>
            <content:encoded><![CDATA[<p>Tactical Tech interviewed <strong>Soledad Magnone, director of </strong><a href="https://jaaklac.org/"><strong>JAAKLAC</strong></a>, who shared insights into the work of the organisation and the collaboration with Tactical Tech.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*xU-vG51_pjKYlVMz.png" /><figcaption><em>Logo courtesy of JAAKLAC</em></figcaption></figure><p><strong><em>Tactical Tech: How do you imagine a different digital future?</em></strong></p><p>Soledad Magnone: I imagine a future in which the digital ecosystem is framed by human rights and centred on our younger generations, especially from the Majority World. This would be manifested by implementing policies, programmes and practices on critical digital education that foster the protection and active participation of children, adolescents and youth.</p><p><strong><em>TT: What is the JAAKLAC’s mission?</em></strong></p><p>SM: JAAKLAC aims at bridging digital divides in learning that are defining our information society and amplifying a social inequality crisis worldwide. To this end, the initiative advocates for a quality education tuned with the challenges and opportunities of the digital age. Through collaborative projects, JAAKLAC designs and researches the possibilities of <a href="https://jaaklac.org/blog/nominorfutures/">Critical Digital Education</a> (CDE), which nurtures individuals’ understanding of how technologies affect societies and the environment and promotes collective action to manifest fairer digital societies.</p><p><strong><em>TT: Could you share some insights into the collaboration with Tactical Tech?</em></strong></p><p>SM: Tactical Tech has a wealth of resources on digital security, privacy, misinformation and well-being in accessible formats and various languages. The main strengths of these resources pertain to their design in collaboration with partners of diverse backgrounds from around the world. 
<strong>Resources such as the Data Detox Kit, What the Future Wants and the Glass Room have been instrumental in implementing JAAKLAC’s projects.</strong> These have been used to (a) facilitate discussions around the implications of digital technologies, (b) present practical solutions accessible to broader audiences, and (c) inspire communities to build upon Tactical Tech’s materials and start alternative initiatives.</p><p><strong><em>TT: What essential collaboration results would you like to highlight?</em></strong></p><p>SM: <a href="https://medium.com/@Info_Activism/detox-de-datos-latinx-campaigning-with-characters-in-latin-america-the-caribbean-b26a8a8c59ce">In recent years, we have collaborated with Tactical Tech to implement social media campaigns and ideate plans for community exhibitions in Latin America with and for youth. </a>We have raised awareness and facilitated educational resources on various pressing issues in the region stemming from the intersections between human rights and digital technologies. Secondly, the projects enabled a platform for Latin American digital rights organisations to co-create with schools and youth groups. Finally, the projects have encouraged young people and civil society organisations to reimagine and better tailor Tactical Tech’s resources towards the region’s values, wants and needs.</p><p><strong><em>About the Saga Detox de Datos Online</em></strong></p><p>Saga Detox de Datos Latine is a continuation of the work that began with <a href="https://jaaklac.org/blog/sagadetoxdatoslatine/">Detox de Datos Latinx </a>in the previous year. This time, JAAKLAC invited digital rights and cybersecurity activists from across the region to work with youth and educational organizations to develop the resource. They ran workshops focusing on co-creation. 
The results are creative and mesmerizing visuals and stories that engage readers in privacy, security, and well-being in unique and memorable ways.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*i34dwJGOSPH7h_Zi.gif" /><figcaption><em>Overview of Saga Detox de Datos Latine, courtesy of JAAKLAC</em></figcaption></figure><p>You can read more about the campaign <a href="https://jaaklac.org/blog/sagadetoxdatoslatine/">at this link</a>. Watch explanatory videos in <a href="https://www.youtube.com/watch?v=xiKBd4HapFA&amp;list=PLKQkcC_PX8sRkee71wWoHwPVCTCpNIuqq">their YouTube playlist</a>.</p><p><strong><em>TT: How did your project and the partnership make a difference?</em></strong></p><p>SM: JAAKLAC’s strategy and activities are aimed at elaborating CDE practices and showcasing pathways to inspire policymakers, technologists, researchers and activists to collaborate with children and youth. For this, JAAKLAC’s projects comprised a roadmap of activities for youth and organisations to Share, Learn and Do It Together. These entailed a series of online workshops (Oficinas) and other hands-on activities that were dedicated to recognising participants’ lived experiences and diverse viewpoints around the harms and benefits emerging from the digital ecosystem. The Oficinas are spaces for horizontal dialogue that prototype multi-stakeholder digital governance centred on children, adolescents and youth. In this sense, Tactical Tech’s partnerships have supported advancement in the co-design and outreach of this approach and activities in Latin America.</p><p><strong><em>TT: Why is it essential to continue these collaborations and partnerships?</em></strong></p><p>SM: Tactical Tech’s partnerships support effectively tailoring and disseminating educational resources dedicated to issues and groups consistently overlooked in global digital agendas. 
These cooperations have been fundamental to nurturing synergies between the knowledge and projects of Tactical Tech and organisations from different geographies and backgrounds. The collaborations have represented significant contributions to Tactical Tech’s initiatives beyond efforts to translate resources. These have opened up opportunities to repurpose the resources and bring about creative ideas that boost global movements by and for those most acutely affected by the shortcomings of digital solutions and manifest a material change.</p><p>Find out more about <a href="https://jaaklac.org/">JAAKLAC Iniciativa</a> and <a href="https://jaaklac.org/blog/causas-digitales-2020/">Causas Digitales</a>.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=6a312dc7d25" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Co-Creating a Public Education Intervention: A Contextual Analysis]]></title>
            <link>https://medium.com/@Info_Activism/co-creating-a-public-education-intervention-a-contextual-analysis-15a45e224300?source=rss-f41206e28794------2</link>
            <guid isPermaLink="false">https://medium.com/p/15a45e224300</guid>
            <category><![CDATA[education]]></category>
            <category><![CDATA[education-technology]]></category>
            <category><![CDATA[co-creating]]></category>
            <dc:creator><![CDATA[Tactical Tech]]></dc:creator>
            <pubDate>Thu, 01 Dec 2022 10:26:43 GMT</pubDate>
            <atom:updated>2022-12-01T10:42:06.173Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*Y-P2oDEUX33shU_5rkuYQg.jpeg" /></figure><h3>Introduction</h3><p>What the Future Wants is a project about growing up in a digital, quantified world. But what does it <em>really</em> mean to grow up during a time of unprecedented technological change and acceleration? By the time children become young adults, they are already familiar with the idea that their actions, thoughts, beliefs and emotions are data fodder for tech companies; that their circle of friends is hundreds or even thousands of people wide, including strangers and bots; that not having access to technology for a few days or hours can cause symptoms of withdrawal and panic; and that their dream job, or the jobs of their parents, may be carried out by a computer programme by the time they graduate. This is just a snapshot of the many ways that digital technologies shape young people’s lives.</p><p>We are approaching a time of critical awareness about the impact technology has on the lives of young people. In 2021, the UN Convention on the Rights of the Child acknowledged for the first time that <a href="https://www.ohchr.org/EN/HRBodies/CRC/Pages/GCChildrensRightsRelationDigitalEnvironment.aspx">children’s rights apply in the digital world</a>, and students around the world are protesting the use of <a href="https://www.theverge.com/2020/8/17/21372045/uk-a-level-results-algorithm-biased-coronavirus-covid-19-pandemic-university-applications">algorithmic decision making</a> and <a href="https://www.vice.com/en/article/7k9ag4/schools-are-abandoning-invasive-proctoring-software-after-student-backlash+">discriminatory technologies</a> in their schools. 
Meanwhile, pressure is mounting for social media companies to address the <a href="https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739">unique problems</a> that young people face when using their platforms. And yet, many of the problems persist and some are getting worse, especially in the context of the pandemic and the technological dependency that it creates.</p><p>What the Future Wants was born from the emerging need for critical thinking resources designed for young people, to help them understand the new and existing issues that impact them in their digital lives. What are the biggest concerns impacting young people? How can they be addressed? And how can young people become part of the effort to find solutions for themselves and for future generations?</p><p>These questions were the starting point for us at Tactical Tech, a Berlin-based NGO. For the past two decades, we have been working on how to support people to understand, unpack and reflect on technology’s place within society. Over the years we have received regular requests — from parents, educators and others working with young people — to bring our resources into educational environments. Over time, we observed the potential of these projects to give young people critical and thought-provoking tools for understanding the digital world. In 2020, we adapted the Data Detox Kit into a <a href="https://datadetoxkit.org/en/families/datadetox-x-youth">workbook for young people,</a> which has since been translated into 15 languages and has been used by parents, educators and trainers to start conversations with young people around the world.</p><p>In 2021, we were given the opportunity to adapt our exhibition about how technology is shaping society, The Glass Room, for young people, which became the What the Future Wants project. Since its inception, The Glass Room has attracted people of all ages. 
Some of the most curious visitors we have had to The Glass Room were teenagers, maybe dragged in at first by a school trip or parents, but often staying longer than other visitors and asking meaningful and engaging questions. In 2019, a 14-year-old student in The Hague, who had seen The Glass Room at a conference she visited with her father, contacted us asking if she could host it at her school to teach her peers. Since then, the intervention has been hosted at multiple schools, libraries and institutions.</p><figure><img alt="Two colleagues hanging up one of the What the Future Wants posters; photo courtesy of Daisy Kidd" src="https://cdn-images-1.medium.com/max/1024/0*KiCQWBrRqQtEPnjS.png" /><figcaption><em>Two colleagues hanging up one of the What the Future Wants posters; photo courtesy of Daisy Kidd</em></figcaption></figure><p>As we found out, the way youth view technology is complex, and it is not always in line with how ‘Generation Z’ are commonly understood and represented in the media. In this two-part series, we explore how Tactical Tech’s first youth-centered team co-created the public education intervention ‘<a href="https://theglassroom.org/en/what-the-future-wants/">What the Future Wants</a>,’ which was released in Spring 2022. The series outlines the initiation, delivery, and findings of the project (so far) in the hope that other organisations, initiatives, and individuals working with young people can learn from our experience and the rich insights of all of the young people who contributed.</p><p>In Part 1, we outline why co-creation as a method lies at the heart of our approach to a series of workshops and explain the guiding principles behind it. 
<a href="https://tacticaltech.org/news/youth-co-creation-findings/">In Part 2</a>, we go into detail about how we implemented these principles in practice.</p><h3>Uncertainty or Being okay with not knowing the answers</h3><p>The culture of uncertainty is particularly relevant in the context of the digital worlds that young people inhabit. From the outside, it may seem that young people are the masters of digital technologies, traversing new platforms, trends, and skills with ease and confidence, at a speed that is hard to keep up with. It can be alienating to enter into this world as an outsider and to find a whole generation of people who just know how this stuff works: when did they learn? And who taught them?</p><p>However, to assume that young people are ‘in the know’ misses an important point: the internet and digital technologies were never designed for children(1)<a href="https://tacticaltech.org/news/youth-co-creation-context#_ftn1"> </a>— unlike the environments children grew up in before the internet, which centuries of human knowledge and understanding had shaped into safe and nurturing spaces. Their knowledge about this digital space is self-taught and has emerged out of a necessity to take risks, belong and explore, as any teenager has the right to do. The only difference is that they are developing in a new territory that is not well mapped out, and so the risks and challenges that present themselves are unprecedented, for them and for others.</p><p>In his book ‘New Dark Age’, James Bridle writes: ‘Over the last century, technological acceleration has transformed our planet, our societies, and ourselves, but it has failed to transform our understanding of these things.’(2) Here Bridle identifies a crucial characteristic of our digital age: it is rooted in uncertainty, because human understanding and comprehension have not caught up with the advancement of technologies and the myriad ways that they impact us and our surroundings. 
This is in part because of the normalisation of technologies. As they fit seamlessly into our everyday lives — the algorithm that decides what song you’ll listen to next, the psychological tuning that makes you stay online for hours, the supply chain that delivers a new iPhone model every year — these systems and mechanisms expertly meet our evolving needs, making it easy to stop questioning them. Even at a time when a simple search query can generate thousands or millions of possible avenues of knowledge, these vast swathes of information can leave us feeling paralyzed(3).</p><p>We do not fully understand what this might mean for the next generations, as transformative times like these are rooted in uncertainty and driven by a constant state of flux. But within the uncertainty there is an opportunity for new modes of thinking that lean into the uncertainty and explore it with curiosity. As is written in the introduction of the ‘Framework for Information Literacy for Higher Education’(4), from the Association of College and Research Libraries: ‘Students have a greater role and responsibility in creating new knowledge, in understanding the contours and the changing dynamics of the world of information, and in using information, data, and scholarship ethically.’</p><h3>Curiosity as a form of critical thinking</h3><p>Since its inception, The Glass Room exhibition has provided a space for people to be curious about technology and to explore its impact on the world. It is not a space that provides answers, but rather one that poses questions and offers a forum for discussion. 
The objects on display are sometimes playful, cheeky or challenging, but more importantly, they tell a story about technology through a unique lens, giving people an entry point into a topic that may have previously seemed impenetrable.</p><p>Learning from The Glass Room’s breadth of exhibition experiences, we knew that this approach of curiosity as a means for critical thinking works well with young people. The idea for co-creation took that concept one step further: instead of engaging the youth perspective only at the point of visiting, the curiosity mindset was built into the initiation and conceptualisation of the project. The curiosity mindset works both ways. We entered into the project with an open mind, ready to learn from the participants, trying to shed our preconceived notions that youth were addicted to their phones and unaware of data collection and targeted advertisements, and open to the possibility that everything we knew before might be turned on its head. We also wanted to encourage a curious and open mindset from the participants, towards their individual perspectives on the issues at hand, like algorithmic curation and the role of influencers, but also towards each other, to allow their different cultures, backgrounds, genders and ages to promote learning and connection rather than to deepen differences.</p><figure><img alt="Visual of topics of interest clustered in a web; courtesy of Dominika Knoblochová" src="https://cdn-images-1.medium.com/max/1024/0*uJGnm39jgw6HnZXx.png" /><figcaption><em>Visual of topics of interest clustered in a web; courtesy of Dominika Knoblochová</em></figcaption></figure><p>An insightful article that sheds light on this concept is by Barbara Fister, written as part of the Project Information Literacy Provocation Series. 
In an article titled ‘Principled Uncertainty: Why Learning to Ask Good Questions Matters More than Finding Answers’(5), she poses the question: ‘What can we do as educators to encourage the ethical practice of curiosity, not just for times of uncertainty but as an everyday habit?’ Her answer, to paraphrase, is that we must encourage students to lean into uncertainty rather than retreat into the familiar. In doing this, students learn the skill of asking open-ended, exploratory questions that have no definitive answer. The learning, then, is in the pursuit of uncertainty rather than the adoption of established ideas. As Fister writes: ‘An approach to uncertainty grounded in curiosity invites students to claim their own authority as they formulate their understanding. If we support them when they venture into the unknown and give them the tools to move forward with integrity, they will be able to explore territories their teachers haven’t already mapped.’</p><h3>Methods and Principles of Co-Creation</h3><p>In a <a href="https://wip.mitpress.mit.edu/collectivewisdom"><em>study</em></a> from MIT’s Co-Creation Studio, the researchers define co-creation as “an alternative to the single-author vision and involves a constellation of media production methods, frameworks, and feedback systems. In co-creation, projects emerge from process and evolve from within communities, with people, rather than for, or about these communities.”(6)</p><p>The MIT study identifies four types of co-creation: ‘within communities (real world and virtual), across disciplines, and humans working with non-human systems.’ The What the Future Wants project followed type one: co-creation in communities, in person(7). 
What became clear from researching co-creation methodologies is that the core principle of co-creation is collaboration, with a strong element of listening and inclusion.</p><p>The practice of co-development and co-creation is not new, but it has risen in popularity within the context of media and storytelling. <strong>What does an alternative to a single-author vision mean in the context of youth, data and technology?</strong> Firstly, it creates a space for multiple perspectives — and importantly different perspectives — so that planning for uncertain digital futures can be inclusive of younger generations and avoid blind spots. And secondly, it allows for a deeper level of critical thinking and action around technology, resulting from participants’ ownership and self-authorship of the topics explored.</p><figure><img alt="Caption: Diagram of core methods and principles; visual by Yiorgos Bagakis" src="https://cdn-images-1.medium.com/max/1024/0*hNovxMssrBJzjEVO.jpg" /><figcaption><em>Diagram of core methods and principles; visual by Yiorgos Bagakis</em></figcaption></figure><p>Despite having extensive experience with partnerships and collaborations, Tactical Tech had not carried out co-creation with this age group before, and so most of our approach was guided by partners, existing research and methodologies. In particular, we followed defined core methods as well as core principles. Our co-creation approach was based on these core methods:</p><ul><li><strong>Participatory action research:</strong> A form of research that emphasises participation and action by members of communities affected by that research. 
It often involves co-creation as a way of understanding the world from that community’s perspective.</li><li><strong>Peer learning:</strong> A self-directed, collaborative form of education where people assemble collective wisdom to reach learning outcomes.</li><li><strong>Design thinking:</strong> A non-linear process to challenge assumptions, redefine problems, create innovative solutions to prototype and test, and share tools and best practices.</li><li><strong>Active listening:</strong> When we actively listen, we suspend our own thought processes and give the person speaking our full attention. We make a deliberate effort to understand someone’s position and their underlying needs, concerns and emotions.</li></ul><p>Furthermore, our co-creation approach was based on these core principles:</p><ul><li><strong>Curiosity as critical thinking:</strong> As stated above, ‘an approach to uncertainty grounded in curiosity invites students to claim their own authority as they formulate their understanding.’ This meant allowing plenty of space for participants to mull over ideas and ask big questions.</li><li><strong>Pursuit of uncertainty:</strong> If the participants had come into the workshop knowing that we knew the answers, their engagement would have been very different. Therefore, we adopted The Glass Room’s method of not providing answers but instead giving people a space to ask questions. 
In this light, we were facilitators of the space, allowing a safe and exploratory pursuit of big questions.</li><li><strong>Equity and inclusion:</strong> This meant agreeing on common terms and clearly spelling out decision-making, ownership and governance issues.</li><li><strong>Process-oriented:</strong> Whilst the outcomes of the workshops fed into the development of the What the Future Wants exhibition, this was not their core focus; the process itself was.</li></ul><p>With What the Future Wants, we are trying to find those blind spots, to pay attention to the insights and experiences of young people as they explore the unknown, uncertain future that is technology.</p><p>Read on to <a href="https://tacticaltech.org/news/youth-co-creation-findings/">Part 2</a> to learn about how we implemented these principles in a series of workshops that went into co-creating the What the Future Wants exhibition.</p><p>- —</p><p><em>Original essay written by Daisy Kidd. Thanks to Dominika Knoblochová for coordination and support during the co-creation process. Reformatting and edits into a mini-series by Safa Ghnaim, Stefanie Felsberger, and Christy Lange. If you are working on similar topics and want to collaborate, we would love to hear from you. Please write to the team at youth@tacticaltech.org.</em></p><p><em>- —</em></p><p>Footnotes:</p><ul><li>“Online safety: Internet ‘not designed for children’”, BBC News, 5th January 2017, <a href="https://www.bbc.com/news/education-38508888">https://www.bbc.com/news/education-38508888</a></li><li>“New Dark Age” by James Bridle, Verso, 2018</li><li>“The sheer amount of information about every current idea makes those concepts difficult to contradict, particularly in the framework where public consensus has become the ultimate arbiter of validity. In other words, we’re starting to behave as if we’ve reached the end of human knowledge. 
And while that notion is undoubtedly false, the sensation of certitude it generates is paralyzing.” — Chuck Klosterman, But What If We’re Wrong?</li><li>Association of College and Research Libraries, January 11, 2016, “Framework for information literacy for higher education,” <a href="https://www.ala.org/acrl/standards/ilframework#inquiry">https://www.ala.org/acrl/standards/ilframework#inquiry</a></li><li>Barbara Fister, February 16, 2022, “Principled Uncertainty: Why Learning to Ask Good Questions Matters More than Finding Answers,” PIL Provocation Series, 2(1), Project Information Literacy Research Institute, <a href="https://projectinfolit.org/pubs/provocation-series/essays/principled-uncertainty.html">https://projectinfolit.org/pubs/provocation-series/essays/principled-uncertainty.html</a>.</li><li>Co-Creation Studio, MIT, COLLECTIVE WISDOM: Co-Creating Media within Communities, across Disciplines and with Algorithms, <a href="https://wip.mitpress.mit.edu/collectivewisdom">https://wip.mitpress.mit.edu/collectivewisdom</a></li><li>Many of the What the Future Wants co-creation workshops ended up being delivered online because of school closures and pandemic restrictions.</li></ul><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=15a45e224300" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Defining Political Influence after the Russian Invasion of Ukraine]]></title>
            <link>https://medium.com/@Info_Activism/defining-political-influence-after-the-russian-invasion-of-ukraine-f0ede624f7a9?source=rss-f41206e28794------2</link>
            <guid isPermaLink="false">https://medium.com/p/f0ede624f7a9</guid>
            <category><![CDATA[politics]]></category>
            <category><![CDATA[personal-data]]></category>
            <category><![CDATA[influence]]></category>
            <category><![CDATA[russia]]></category>
            <category><![CDATA[ukraine]]></category>
            <dc:creator><![CDATA[Tactical Tech]]></dc:creator>
            <pubDate>Mon, 21 Mar 2022 15:56:17 GMT</pubDate>
            <atom:updated>2022-04-01T09:46:49.495Z</atom:updated>
            <content:encoded><![CDATA[<p><em>by Amber Macintyre, Project Lead, The Influence Industry Project</em></p><p>The Russian invasion of Ukraine has prompted many of us to ask ourselves: what can we do right now, how might our future change, and what could we have done? For those of us working on digital influence, whether on political communications, misinformation campaigns, or data-driven marketing, this last question is particularly pertinent given that information on Russian influence campaigns is not new. The Oxford Internet Institute reported that since at least 2014 there has been evidence of Russian state actors conducting influence campaigns in Ukraine, describing it as “the most globally advanced case of computational propaganda.”<a href="#sdfootnote1sym">1</a> Researchers and practitioners of digital influence alike must consider with hindsight: have we been acting on our knowledge as well as we could have? Have our recommendations and responses been appropriate, and was there evidence we should have taken more seriously?</p><p>In this article, we re-examine our worldwide research on political influence to understand what has framed our work so far, and how our approach might change. We identify the shifting features of political influence, in practice and ethics, in light of the severity of the impact of Russia’s persistent disruptive influence campaign. We conclude with a consideration of what this means for the role and responsibilities of civil society. 
Our analysis reveals how our work will continue to reinforce the need for secure data practices in political campaigns, question the role of the industry working on political influence, and adapt our resources to include civil society’s emerging responses to disruptive influence techniques.</p><p><strong>Data used for influence can change hands</strong></p><p>Political influence worldwide is conducted on the basis of personal data that is collected, analysed and hosted in databases held by political groups, private companies and platforms.<a href="#sdfootnote2sym">2</a> This data can change hands, and when it does, its purpose and meaning can change too. In Nigeria, data held by phone companies has been used by the government in the name of national security, and in turn for political election campaigns.<a href="#sdfootnote3sym">3</a> In the US, public administrative data is used by political parties to profile their audience and target communications.<a href="#sdfootnote4sym">4</a> In Ukraine, data collected during campaigning in the run-up to an election became government data when the political party won in 2019.<a href="#sdfootnote5sym">5</a></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*8dx4eY-Ydbd_PjTILzEdvA.jpeg" /></figure><p>While it is clear that data can change hands and purposes quickly, what we don’t know yet is how data might change hands, and meaning, in this war. Political groups in Ukraine and private firms hold datasets representing the level of support for political parties during the last election in 2019. Datasets held by political parties who conducted campaigning may also contain developed profiles of individuals. 
For example, in the run-up to the election, the Volodymyr Zelensky campaign had “segmented its audience into 32 categories according to age, gender, professional affiliation, or political interest.”<a href="#sdfootnote6sym">6</a> These profiles are usually connected to the proposed political leaning of the person based on the data available about them, and may identify people by sensitive features of their identity such as LGBTQ+ status, religious affiliation, or economic status, including whether someone is a migrant. Furthermore, data representing the followers of political parties’ social media profiles is not only owned by the platform; often, followers’ names and profile photos are also visible to anyone in the public.</p><p>These datasets are politically sensitive, and the protection of the data can be weak. Privacy policies of most organisations are usually vague and ambiguous, leaving the interpretation of how data could be used open. Sometimes there is no privacy policy at all: in Ukraine, only two of the parties in the last election had privacy policies on their websites.<a href="#sdfootnote7sym">7</a> Further, very few privacy policies on any platform have established procedures for how data will be processed in the case of a conflict-based crisis. Even when there is a privacy policy, sometimes the software isn’t adept at keeping data secure: in one of the campaign apps in Uganda, a little technical knowledge opened up access to users’ uploaded photos and related metadata such as location or the model of their phone.<a href="#sdfootnote8sym">8</a></p><p>The risk of data changing purposes is one that motivates the work of digital security experts and privacy campaigners. The sudden change of political environment in Ukraine, and the much more open threat to the current leadership, is a reminder that we must also consider how data is stored, archived, and repurposed. 
If the parties host the data, they could delete it quickly to eliminate inappropriate use, but they could also use it in their current communications framed by conflict. If another group ends up in control of the data, they could use it to track or persecute people who have been identified as unsupportive of their goals. Influence practices that are ‘routine’ or ‘safe’ can create data that may be repurposed in dangerous ways. Anyone working with data in digital campaigns for politics or civil society should take care over what data they choose to collect, how they choose to store it, and what they will do with it in the future.</p><p><strong>Technologists change roles</strong></p><p>Just as datasets are repurposed, so are the skillsets of the technologists who manage the data. Technology professionals regularly adapt their skills to new contexts between companies and causes. Furthermore, there has been a notable crossover of knowledge and personnel between political communications and for-profit marketing.<a href="#sdfootnote9sym">9</a> In our research, we found that many people working at Facebook had previously worked in government relations, policy or security. For example, the former Deputy Prime Minister of the UK, Nick Clegg, is now the President of Global Affairs at Meta, formerly Facebook.<a href="#sdfootnote10sym">10</a> The exchange of personnel between private firms working within political influence and the political causes or candidates themselves leads to an exchange of knowledge, values and skills, which may be of great advantage in a new context but may also clash with the needs of their new roles.</p><p>The role of a technologist can also change within a single context. In Ukraine, Mykhailo Fedorov was the chief digital strategist for Zelensky’s election campaign in 2019. Once Zelensky was elected, Fedorov was appointed Minister of Digital Transformation and Vice Prime Minister. 
This role has adapted again to the context of war, as Fedorov runs Ukraine’s “formidable war machine”<a href="#sdfootnote10sym">10</a> with a team of skilled technologists. The team are working on a public campaign targeting technology firms to get them to boycott Russia in whichever way they can, receiving donations in cryptocurrency, and inviting citizens to submit images and videos of the Russian military’s movements. As Fedorov tweeted: “Thank you for the first steps for peace, but @Microsoft @Azure @BillGates @satyanadella, you have to do more to stop the war. Stop supporting services and products in Russia as long as their tanks and missiles kill Ukrainians!”</p><p>The nature of crisis can change what is asked of technologists, and their role within politics: for example, they may change affiliation (either between groups or from a ‘neutral’ company to a partisan one) or come under new forms of pressure and consequently share, transfer or leak data. We have to ask: how can the experience of a technologist in one context, such as a marketing firm or a political election campaign, benefit or hinder influence within a war? Are there conflicts of interest, such as a focus on profit, when a technologist changes role or company? When technologists change roles, do they take their data with them, potentially contributing to the intelligence they can use towards their new goals? 
And finally, what technical training or political education do those of us working with and on digital influence need in order to account for this flexible environment between politics, security, and marketing?</p><p><strong>Tools and tactics of influence</strong></p><p>The practice and theory of what tools are used for political influence has always been framed by often dichotomous judgements on current events: Barack Obama’s presidential election campaign’s data team and tactics were considered a prototype for many political campaigns;<a href="#sdfootnote11sym">11</a> tech-driven influence campaigns in the run-up to elections in Nigeria and Kenya led to an uptake of the same tactics in Gambia and Senegal;<a href="#sdfootnote12sym">12</a> Cambridge Analytica and the polarising tactics of the Brexit campaign in the UK focused scrutiny on the influence industry; the Arab Spring prompted many organisations to examine the role of social media in protests and uprisings. Now, Russia’s invasion of Ukraine prompts us to examine the research on digital influence that we have conducted so far to iterate our understanding once again.</p><p>Many of the routine tactics of election and political campaigns remain relevant in a crisis, including segmenting, profiling and targeting individuals with personalised communications on Facebook or WhatsApp.<a href="#sdfootnote13sym">13</a> However, the tactics in crisis, conflict, and disruptive campaigns also differ from those of routine political election campaigns. In the run-up to Russia’s military invasion of Ukraine, and during it, some of the tactics used are deemed unacceptable, outside the norm, and anti-democratic approaches to political influence, such as bots, trolls and misinformation. Regarding Russia’s influence in Ukraine, there are substantial demonstrations of these different techniques. 
For instance, more than 2,000 Facebook profiles in Ukraine were connected to a Russian ‘profile farm’, other Facebook pages were also found spreading misinformation, and one investigation found offers from a bot farm to create hundreds of fake accounts on Facebook and leave tens of thousands of ‘comments’ in support of or against a particular candidate.<a href="#sdfootnote14sym">14</a></p><p>Some tools are seen as neither purely acceptable nor unacceptable. For example, negative campaigning is considered acceptable in election campaigns (when pointing out flaws of the opposition or their previous policies), but becomes contentious when used to encourage distrust in an institution and disrupt political stability. Targeted advertising that recommends products or music may be seen as more acceptable than sensitive and risky political profiling that uses personality types or sentiment analysis. If our views change based on who uses these tools and when, is it worth using these tools at all, and if so, what safety measures or regulations do we need to take when using them?</p><p><strong>International Relations and Influence</strong></p><p>One of the substantial features of the current context of influence is that the political groups looking to gain advantage through influence are nation states. State-coordinated information campaigns on their own citizens, or citizens of another nation, shift the discussion of influence beyond elections and social movements into<em> international relations.</em> The influence tactics change on this basis too. 
For example, Russian-linked hackers were connected to attempts to disrupt state electronic infrastructure in Ukraine in 2019.<a href="#sdfootnote15sym">15</a> As Carole Cadwalladr has shown, Russia has exerted influence financially, politically and through social media around the world.<a href="#sdfootnote16sym">16</a> Those of us working with campaigns and influence have to take the term as multi-layered: from social media influencers to promoting a candidate in an election to disrupting national stability.</p><p>Another notable feature of nation states’ influence tactics is the use of established and traditional media outlets. In Russia, only state-approved media coverage is accessible. China is likewise showing only Russian-controlled media coverage of the war. On the other hand, many countries across Europe have banned <em>Russia Today</em>. Furthermore, states can control access to social media platforms. Russia has banned Facebook and Instagram within the country. Banned sites can be reached only by taking the risk of using a VPN, which is illegal in Russia. It is not uncommon in some countries for governing groups to carry out internet or media blackouts around elections or conflicts. The scale and control of information in this war has highlighted how communication strategies must deal not only with the over-saturated media environment of the internet but also with limited, controlled, and censored media environments.</p><p><strong>The Influence Industry: The money in influence</strong></p><p>Many influence techniques are not instigated by political parties alone, but with assistance from companies who are part of the influence industry. This business is lucrative. 
During the last Ukrainian election in 2019, it was estimated that parties spent over 1,800,000 US dollars on Facebook ads.<a href="#sdfootnote17sym">17</a> A bot farm that could leave 40,000 comments for or against a candidate could cost up to 20,000 Euros.<a href="#sdfootnote18sym">18</a> In the UK, up to 500,000 GBP was spent on one digital influence consultant in one campaign.</p><p>The public-facing and established side of this industry has been well documented by Tactical Tech.<a href="#sdfootnote19sym">19</a> However, this industry also has a private, sordid face, with smaller and harder-to-track networks and freelancers. These ubiquitous but underground techniques are less visible and far harder to find. In our research so far, we have only seen hints of fake news pedlars paid in Sub-Saharan Africa<a href="#sdfootnote20sym">20</a>, one or two companies advertising the sale of misinformation techniques<a href="#sdfootnote21sym">21</a>, and companies offering objectionable services to undercover journalists posing as political parties. This shady space is much harder to monitor.</p><p>One approach to monitoring the role of the industry would be to track its financial spending. However, while in 60% of countries financial spending must be reported, this is not consistently carried out or easily accessible to monitor.<a href="#sdfootnote22sym">22</a> In these cases other methods might be used. For example, using Facebook’s ad library it was possible to find that in Ukraine’s last election “only three parties of those participating in the 2019 elections declared spending compatible with the amounts indicated in Facebook’s library of political advertisements.”<a href="#sdfootnote23sym">23</a> These quantitative data techniques are useful for investigation when there is data, but interviews or journalists embedded within campaigns can help when there is little public data. 
Transparency around political influence worsens substantially in conflict and crisis, as political influence becomes covered by national security protections.</p><p>Regulation requiring transparency measures from the parties and companies themselves could give us an accurate idea of who is buying which services from which companies. In the meantime, investigative methodologies need to be used to understand how many more companies are offering services, what exactly these services entail and their impact on politics, so the industry can be monitored, regulated and held accountable.</p><p><strong>Responses from Regulators and Technology Companies</strong></p><p>Government regulation and technology company policies can be used to some extent to manage the consequences of digital influence. However, it is well recognised that policy and regulation are far behind the needs of the online information environment, whether in managing omnipresent social media platforms or identifying fake news. In many cases, the issues have been raised but have yet to be acted on, either because policy takes time, as with the now long-awaited Digital Services and Digital Markets Acts, or because, as was the case in Ukraine, the situations were “deemed not extensive enough to affect the voting process or the electoral outcome”<a href="#sdfootnote24sym">24</a> to justify limiting the content of disruptive influence campaigns.</p><p>Yet, since Russia began its invasion of Ukraine there has been a substantial sanction-style response from the tech industry: Facebook stopped monetizing or selling ads to Russian state media, Twitter paused all Russian ads in Ukraine, and tech companies have stopped providing access to or selling their services in Russia.<a href="#sdfootnote25sym">25</a> There are many reasons to examine why the companies did not act to create policies limiting this influence in the first place, including the Euro-centrism of the response to this war. 
There is also an important question as to whether their response is appropriate and effective now, and this can only become clear after some time has passed. Either way, what can we learn from it for the international regulation of digital communication spaces as we go forward?</p><p><strong>Responses from Civil Society engaged in Influence</strong></p><p>Many campaigns and civil society groups are involved in their own counter-campaigns against the influence techniques of bots, trolls, and misinformation. There are tools and strategies for debunking fake news stories.<a href="#sdfootnote26sym">26</a> Existing resources on digital security are being shared.<a href="#sdfootnote27sym">27</a> One Telegram group called the “IT Army of Ukraine” assigned tasks to hackers to disrupt access to Russian state websites. People have responded to censored contexts by flooding information where possible: pro-Ukrainian posts have flooded VK, a Russian-language social media platform, and Instagram before it was banned.<a href="#sdfootnote28sym">28</a> This content ranges from hard-hitting evidence from the ground to light-hearted memes.<a href="#sdfootnote29sym">29</a> Building civil society’s capacity for influence has always meant balancing training in how to use data-driven tools with their ethical consequences, and now civil society’s skills increasingly need to include responding to disruptive influence techniques.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/491/1*Jq9O320t_A3jV5phb4m3Cg.jpeg" /><figcaption>meme4Ukraine is a Twitter account posting well-known memes adapted for a pro-Ukraine message <a href="https://twitter.com/Meme4Ukraine/status/1501441217556529152/photo/1">https://twitter.com/Meme4Ukraine/status/1501441217556529152/photo/1</a></figcaption></figure><p>Digital influence is a shifting concept, changing both with the available technologies and the way we’ve seen them used. 
The Russian invasion of Ukraine has reinforced a broader understanding of digital influence, one that includes disruptive techniques, and has shown the potential severity of the situations in which influence is used. The industry, from individual consultants to established technology companies, shapes the use of influence technologies and often works unregulated, inconsistently, and with little transparency. At Tactical Tech, we will continue to build local capacity for investigation and monitoring of digital influence, conduct partnership-based research for an international perspective of digital influence, and develop digital literacy resources on the practices and impacts of influence campaigns. For others working on political influence, our initial analysis leads us to make three recommendations:</p><p>1) Digital influence practitioners, including campaigners, social media influencers, and marketers, should continue to review their data policies and ensure that the collection, retention, and archiving of personal data for political influence accounts for a politically turbulent environment in which the security of personal data can be affected by crisis and conflict.</p><p>2) Investigators looking at the topic of digital influence should examine the impact of the values and practices of the international influence industry, including partisan companies, for-profit tactics used in political contexts, political tactics used in national security contexts, and which technologies are shared between social movements, election groups, and national security communications.</p><p>3) Digital literacy projects should not only continue to build the capacity of the public and civil society to engage with communication technologies and social media platforms in an effective and privacy-conscious way, but also include tactics for how to combat trolls, bots, and misinformation. 
These tactics are evolving; for now they include how to find diverse media sources, verify information, and promote verified information.</p><p><em>If you want to know more, you can check out our work here, sign up to our newsletter, or get in touch with the team.</em></p><p><em>Many thanks for comments and feedback from Marek Tuszynski, Christy Lange, Björk Roi, and Glyn Thomas.</em></p><p><a href="#sdfootnote1anc">1</a> Samuel C. Woolley &amp; Philip N. Howard, “Computational Propaganda Worldwide: Executive Summary.” Working Paper 2017.11. Oxford, UK: Project on Computational Propaganda. comprop.oii.ox.ac.uk. 14 pp.</p><p><a href="#sdfootnote2anc">2</a> Tactical Technology. “Personal Data: Political Persuasion (How It Works),” March 2019. <a href="https://cdn.ttc.io/s/tacticaltech.org/methods_guidebook_A4_spread_web_Ed2.pdf.">https://cdn.ttc.io/s/tacticaltech.org/methods_guidebook_A4_spread_web_Ed2.pdf.</a></p><p><a href="#sdfootnote3anc">3</a> Amber Macintyre. “The Imports and Exports of Sub-Saharan Africa’s Influence Industry” Sept 2020. <a href="https://medium.com/@Info_Activism/the-imports-and-exports-of-sub-saharan-africas-influence-industry-d189a7bb9edf.">https://medium.com/@Info_Activism/the-imports-and-exports-of-sub-saharan-africas-influence-industry-d189a7bb9edf.</a></p><p><a href="#sdfootnote4anc">4</a> Hersh, Eitan D. Hacking the Electorate: How Campaigns Perceive Voters. 
New York, NY: Cambridge University Press, 2015.</p><p><a href="#sdfootnote5anc">5</a> Tetyana Bohdanova, “Personal Voter Data Use in the 2019 Ukrainian Parliamentary Elections: A Report on Digital Influence Outside the Scope of Disinformation”, July 2020, <a href="https://cdn.ttc.io/s/ourdataourselves.tacticaltech.org/The-Use-of-Personal-Voter-Data-During-2019-Elections-in-Ukraine_EN.pdf.">https://cdn.ttc.io/s/ourdataourselves.tacticaltech.org/The-Use-of-Personal-Voter-Data-During-2019-Elections-in-Ukraine_EN.pdf.</a></p><p><a href="#sdfootnote6anc">6</a> Bohdanova (2020)</p><p><a href="#sdfootnote7anc">7</a> Bohdanova (2020)</p><p><a href="#sdfootnote8anc">8</a> Tactical Technology, “The National Resistance Movement App and Digital Politics in Uganda”, April 2021, <a href="https://medium.com/@Info_Activism/the-national-resistance-movement-app-and-digital-politics-in-uganda-558df35b7b48.">https://medium.com/@Info_Activism/the-national-resistance-movement-app-and-digital-politics-in-uganda-558df35b7b48.</a></p><p><a href="#sdfootnote9anc">9</a> Kreiss, Daniel, and McGregor, S., “Technology Firms Shape Political Communication: The Work of Microsoft, Facebook, Twitter, and Google With Campaigns During the 2016 U.S. Presidential Cycle.” Political Communication 35, no. 2 (April 3, 2018): 155–77. <a href="https://doi.org/10.1080/10584609.2017.1364814.">https://doi.org/10.1080/10584609.2017.1364814.</a></p><p><a href="#sdfootnote10anc">10</a> Paul, Kari, “Nick Clegg promoted to top Facebook role”, The Guardian, February 2022, <a href="https://www.theguardian.com/technology/2022/feb/16/nick-clegg-facebook-meta-president-global-affairs.">https://www.theguardian.com/technology/2022/feb/16/nick-clegg-facebook-meta-president-global-affairs.</a></p><p><a href="#sdfootnote10anc">10</a> Tom Simonite and Gian M. 
Volpicelli, “Ukraine’s Digital Ministry Is a Formidable War Machine”, WIRED, March 2022, <a href="https://www.wired.com/story/ukraine-digital-ministry-war/">https://www.wired.com/story/ukraine-digital-ministry-war/</a>.</p><p><a href="#sdfootnote11anc">11</a> Kreiss, Daniel. Prototype Politics: Technology-Intensive Campaigning and the Data of Democracy. Oxford Studies in Digital Politics. Oxford, New York: Oxford University Press, 2016.</p><p><a href="#sdfootnote12anc">12</a> Amber Macintyre. “The Imports and Exports of Sub-Saharan Africa’s Influence Industry” Sept 2020. <a href="https://medium.com/@Info_Activism/the-imports-and-exports-of-sub-saharan-africas-influence-industry-d189a7bb9edf.">https://medium.com/@Info_Activism/the-imports-and-exports-of-sub-saharan-africas-influence-industry-d189a7bb9edf.</a></p><p><a href="#sdfootnote13anc">13</a> Tactical Technology. “Personal Data: Political Persuasion (How It Works),” March 2019. <a href="https://cdn.ttc.io/s/tacticaltech.org/methods_guidebook_A4_spread_web_Ed2.pdf.">https://cdn.ttc.io/s/tacticaltech.org/methods_guidebook_A4_spread_web_Ed2.pdf.</a></p><p><a href="#sdfootnote14anc">14</a> Bohdanova (2020)</p><p><a href="#sdfootnote15anc">15</a> Bohdanova (2020)</p><p><a href="#sdfootnote16anc">16</a> <a href="https://twitter.com/carolecadwalla/status/1502430347832745985?t=L-aRHlsploVIj0gWvoe5xQ&amp;s=09">https://twitter.com/carolecadwalla/status/1502430347832745985?t=L-aRHlsploVIj0gWvoe5xQ&amp;s=09</a></p><p><a href="#sdfootnote17anc">17</a> Bohdanova (2020)</p><p><a href="#sdfootnote18anc">18</a> Bohdanova (2020)</p><p><a href="#sdfootnote19anc">19</a> Amber Macintyre. “The Influence Industry Long List: The Business of Your Data and Your Vote”, Tactical Tech,</p><p><a href="#sdfootnote20anc">20</a> Amber Macintyre. “The Imports and Exports of Sub-Saharan Africa’s Influence Industry” Sept 2020. April 2021. 
<a href="https://medium.com/@Info_Activism/the-imports-and-exports-of-sub-saharan-africas-influence-industry-d189a7bb9edf.">https://medium.com/@Info_Activism/the-imports-and-exports-of-sub-saharan-africas-influence-industry-d189a7bb9edf.</a></p><p><a href="#sdfootnote21anc">21</a> Max Fisher, “Disinformation for Hire, a Shadow Industry, Is Quietly Booming”, The New York Times, July 2021, <a href="https://www.nytimes.com/2021/07/25/world/europe/disinformation-social-media.html.">https://www.nytimes.com/2021/07/25/world/europe/disinformation-social-media.html.</a></p><p><a href="#sdfootnote22anc">22</a> International IDEA. (n.d.) Political Finance: Design Tool, <a href="https://www.idea.int/political-finance-design-tool.">https://www.idea.int/political-finance-design-tool.</a></p><p><a href="#sdfootnote23anc">23</a> Bohdanova (2020)</p><p><a href="#sdfootnote24anc">24</a> Bohdanova (2020)</p><p><a href="#sdfootnote25anc">25</a> <a href="https://twitter.com/annargrs/status/1497670943434366976">https://twitter.com/annargrs/status/1497670943434366976</a></p><p><a href="#sdfootnote26anc">26</a> Bohdanova (2020)</p><p><a href="#sdfootnote27anc">27</a> <a href="https://www.accessnow.org/cms/assets/uploads/2022/03/Ukraine_-Safety-tips-for-civil-society_2022-UA.pdf">https://www.accessnow.org/cms/assets/uploads/2022/03/Ukraine_-Safety-tips-for-civil-society_2022-UA.pdf</a></p><p><a href="#sdfootnote28anc">28</a> <a href="https://www.politico.eu/article/ukraine-russia-disinformation-propaganda/">https://www.politico.eu/article/ukraine-russia-disinformation-propaganda/</a></p><p><a href="#sdfootnote29anc">29</a> <a href="https://twitter.com/uamemesforces/status/1502265977488261133?t=w7GOONmY4Iey8yd5XmzR4g&amp;s=09">https://twitter.com/uamemesforces/status/1502265977488261133?t=w7GOONmY4Iey8yd5XmzR4g&amp;s=09</a></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=f0ede624f7a9" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Time for Big Tech to Grow Up]]></title>
            <link>https://medium.com/swlh/time-for-big-tech-to-grow-up-ac0c1c7955f7?source=rss-f41206e28794------2</link>
            <guid isPermaLink="false">https://medium.com/p/ac0c1c7955f7</guid>
            <category><![CDATA[instagram]]></category>
            <category><![CDATA[mental-health]]></category>
            <category><![CDATA[big-tech]]></category>
            <category><![CDATA[youth]]></category>
            <category><![CDATA[facebook]]></category>
            <dc:creator><![CDATA[Tactical Tech]]></dc:creator>
            <pubDate>Wed, 13 Oct 2021 14:36:36 GMT</pubDate>
            <atom:updated>2021-10-18T08:42:27.827Z</atom:updated>
            <content:encoded><![CDATA[<p><em>by Stephanie Hankey</em></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*QBpLneABDNr2R-1rQQqH0A.png" /><figcaption><em>Illustration by: Sk1iddy, age 13</em></figcaption></figure><p>One of the defining moments of the <a href="https://www.npr.org/2021/10/05/1043377310/facebook-whistleblower-frances-haugen-congress">Facebook whistleblower congressional hearing</a> was when Congressman Richard Blumenthal read out a text message from a member of his constituency in Connecticut. Blumenthal was visibly emotional as he shared the story of a father whose 14-year-old daughter had gone into a negative spiral on Instagram and developed anorexia. The text message from the father ended “I fear she will never be the same”. The hearing fell silent.</p><p>Three years ago, another 14-year-old girl in the UK, Molly Russell, killed herself. Her father claimed her death was ‘<a href="https://www.bbc.com/news/av/uk-46966009">helped by Instagram</a>’ and the way the platform’s algorithms pushed graphic self-harm and suicidal material. In the aftermath of his daughter’s suicide, he appealed to the British Government and to social media companies to recognise and respond to the negative influence of social media on young people.</p><blockquote>“The physical isolation of pandemic-related lockdowns has only increased young people’s dependency on social media, amplifying problems of mental health and online harms”</blockquote><p>In the years between these two fathers telling the stories of their daughters, the physical isolation of pandemic-related lockdowns has only increased young people’s dependency on social media, amplifying problems of mental health and online harms. 
At the same time, recent internal leaks from Facebook have revealed that <a href="https://www.npr.org/2021/10/05/1043194385/whistleblowers-testimony-facebook-instagram">13.5% of teen girls still say Instagram makes thoughts of suicide worse</a> and 17% of teen girls say the same for eating disorders.</p><p>Parents, siblings, grandparents and educators are watching on as the ‘tweens’ and teens in their lives struggle through a complex world of harmful content at the same time as trying to figure out who they are and find their way in the world. Balancing these extremes, whilst trying to give young people the space and freedom to grow independently, is the stuff of many parents’ nightmares.</p><p>The reason why former Facebook employee Frances Haugen’s whistleblowing could be game-changing is that it proves Facebook knows that its algorithms are disproportionately harmful, not only to young people but also to society and democracy, yet in Haugen’s words, ‘<a href="https://www.cbsnews.com/news/facebook-whistleblower-frances-haugen-misinformation-public-60-minutes-2021-10-03/">over and over again, [Facebook] has shown it chooses profit over safety. It is subsidizing, it is paying for its profits with our safety</a>.’ What makes the problem unique for younger generations is that Facebook has an agenda to grow as a business, which means staying popular in a competitive market. Future generations are one of their safest market bets, if (and it’s a big if) they can continue attracting young users. 
When Facebook acquired Instagram in 2012 (for a relatively modest $1 billion), this solved some of its growth problems, but there are always new threats, such as the increasingly popular video sharing platform TikTok, or the chat app Discord.</p><blockquote>“Unfortunately for young users, the content that drives high engagement is often harmful, provoking and tailored to their deepest vulnerabilities”</blockquote><p>The solution to market dominance is simple: make Instagram into a place where it’s hard to look away and it’s easy to stay. Unfortunately for young users, the content that drives high engagement is often harmful, provoking and tailored to their deepest vulnerabilities. Haugen showed evidence that confirmed what many had suspected: the algorithm Facebook uses to serve content turns ‘engagement’ (what you look at, for how long and how often) into profits through advertising, even if this engagement is harmful. It is this ‘engagement-driven logic’ that amplifies negative spirals and creates content ‘rabbit warrens’ (the more you look, the deeper you go). As she explained in a <a href="https://www.youtube.com/watch?v=_Lx5VmAdZSI">recent interview</a>: ‘What’s super tragic is Facebook’s own research says as these young women begin to consume this eating disorder content, they get more and more depressed and it actually makes them use the app more and so they end up in this feedback cycle where they hate their bodies more and more.’ Haugen showed that this is not only known by the company through its own research, but that it creates a conflict of interest that Facebook has disavowed. If they fix the problem, they will make significantly less money. 
They have chosen not to: a decision made by a 1 trillion dollar company that Haugen frames as ‘<a href="https://www.wsj.com/video/whistleblower-says-facebooks-choices-are-disastrous-for-children-democracy/38F49BD4-614F-4BAB-8D6D-C845063C239D.html">disastrous for children and for democracy</a>’.</p><p>Digital technologies, from social media to computer games, have become central to the way young people learn, connect, grow and explore their identities. Indeed, these technologies also have benefits: they can help some young people avoid isolation, seek support with mental health challenges or escape unhealthy home environments. But the idea that these benefits outshine the ills, or that we can leave it up to young people to find a different path through a universe of media algorithmically trained to seek them out and pull them in, ignores the insidious nature of the problem. An overly protective response is wrong: taking technology away from young people is not going to make the problems vanish. Instead, we need to find ways to preserve and grow the digital environment that young people treasure while making it safe, inclusive and nurturing. Recent infrastructure failures such as <a href="https://www.nytimes.com/2021/10/04/technology/facebook-down.html">the blackout that left Facebook and other products such as Instagram and Messenger offline for over 5 hours</a> also raise important questions about what it means to have such centralised power, knowledge and data.</p><p>As societies, we have to start talking about technologies as both problem-solving and problem-making. There is a desperate need to hold some of the wealthiest companies in the world to account for an environment that no amount of educated teachers, attentive parents or even the most disciplined and savvy ‘tweens’ can fix. In standing up to Big Tech, there is no need to start from scratch. 
Whilst the response to the death of Molly Russell has been slow, it has given researchers and advocates in the UK, such as Baroness Beeban Kidron of 5Rights and Sonia Livingstone of LSE, the chance to push for changes in the form of the <a href="https://5rightsfoundation.com/our-work/design-of-service/age-appropriate-design-code.html">Age Appropriate Design Code</a> and the long awaited <a href="https://www.gov.uk/government/news/landmark-laws-to-keep-children-safe-stop-racial-hate-and-protect-democracy-online-published">Online Safety Bill</a>, on which Haugen will advise in the coming weeks. The mood is also changing in China, with TikTok announcing a new ‘<a href="https://www.theguardian.com/technology/2021/aug/12/tiktok-acts-on-teen-safety-with-bedtime-block-on-app-alerts">bed time</a>’ feature for 16 and 17 year olds and making changes to their direct messaging features for younger users. Similarly, Tencent has recently moved to curb computer game addiction amongst its younger users, <a href="https://www.aljazeera.com/economy/2021/8/30/china-forbids-minors-from-gaming-more-than-3-hours-per-week">restricting computer games for under 18s to the weekends</a>, albeit facing criticism for using facial recognition technology to enforce the age-code. With all eyes on Facebook, it has <a href="https://www.bbc.com/news/technology-58707753">paused the development of ‘Instagram Kids’ for 10–12 year olds</a>, in what it claims is an effort to listen more to concerned policy makers and parents in the wake of the recent revelations.</p><p>As a non-profit working internationally on these issues with young people, we at <a href="https://tacticaltech.org/#/">Tactical Tech</a> have seen these issues play out first-hand. 
Our new youth project aims precisely to find out ‘<a href="https://medium.com/swlh/what-the-future-wants-91f7388e0b94">What the Future Wants</a>’, while resources such as <a href="https://www.datadetoxkit.org/en/families/datadetox-x-youth/">Data Detox x Youth</a>, an interactive toolkit for 11- to 16-year-olds, help put young people in control of technologies including social media. We see that young people depend on technology, yet are frustrated with the lack of care that technology companies take for their well-being and mental health. They are not passive users but instead increasingly aware of the problems these technologies create for them, their friends and their younger siblings. The next generation is technology-dependent but also technology-critical. They have led some of the first <a href="https://www.theguardian.com/commentisfree/2020/aug/19/ditch-the-algorithm-generation-students-a-levels-politics">demonstrations around the world against carelessly devised algorithms</a>. Listening to them is important — but being accountable to them is also essential.</p><blockquote>“The next generation is technology-dependent but also technology-critical”</blockquote><p>Facebook has abused its position of trust. It has been dancing around the problems it has created in the information and democracy space since Brexit and Trump’s election. The <a href="https://datadetoxkit.org/en/misinformation/healthhoax/">misinformation it has allowed to propagate around the pandemic</a> has turned up the heat on its practices. Despite the seemingly disparate nature of these topics — youth, <a href="https://ourdataourselves.tacticaltech.org/posts/digital-listening/">elections</a> and healthcare — their root cause is the same. They stem from the attention-based and amplifying nature of the platform, as outlined in the documentary <a href="https://www.thesocialdilemma.com/">The Social Dilemma</a>, and the unethical and astronomical profits these logics produce. 
Evidence presented by Haugen of the knowingly negative impact these practices have on young people’s lives creates not only the potential for a breakthrough but also a gateway issue that unites users and regulators across political divides.</p><p>Non-profits, educators and parents spend time and effort trying to teach young people how to navigate the digital world. Now it is time to acknowledge: it’s Big Tech that needs to grow up. The moment of tech-enthusiasm has passed and Mark Zuckerberg’s apologies have been stretched over too many acts of neglect. Haugen has given regulators the evidence they need for real change. At the very least, we owe it to the youngest members of society to act.</p><p><em>Stephanie Hankey is the Executive Director and co-founder of the Berlin-based non-profit </em><a href="https://tacticaltech.org/#/"><em>Tactical Tech</em></a><em>, the co-curator of </em><a href="https://theglassroom.org/"><em>The Glass Room</em></a><em> and a </em><a href="https://loebfellowship.gsd.harvard.edu/fellows-alumni/current-fellows/"><em>Loeb Fellow</em></a><em> at the Graduate School of Design, Harvard University. Thanks to Michael Uwemedimo, Daisy Kidd and Sasha Ockenden for comments and additions.</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=ac0c1c7955f7" width="1" height="1" alt=""><hr><p><a href="https://medium.com/swlh/time-for-big-tech-to-grow-up-ac0c1c7955f7">Time for Big Tech to Grow Up</a> was originally published in <a href="https://medium.com/swlh">The Startup</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Detox de Datos Latinx — Campaigning with Characters in Latin America & the Caribbean]]></title>
            <link>https://medium.com/@Info_Activism/detox-de-datos-latinx-campaigning-with-characters-in-latin-america-the-caribbean-b26a8a8c59ce?source=rss-f41206e28794------2</link>
            <guid isPermaLink="false">https://medium.com/p/b26a8a8c59ce</guid>
            <category><![CDATA[latin-america]]></category>
            <category><![CDATA[data]]></category>
            <category><![CDATA[latinx-in-tech]]></category>
            <category><![CDATA[technology]]></category>
            <category><![CDATA[youth]]></category>
            <dc:creator><![CDATA[Tactical Tech]]></dc:creator>
            <pubDate>Tue, 24 Aug 2021 10:44:39 GMT</pubDate>
            <atom:updated>2021-08-24T10:44:39.371Z</atom:updated>
            <content:encoded><![CDATA[<h3>Detox de Datos Latinx — Campaigning with Characters in Latin America &amp; the Caribbean</h3><p><em>by Daisy Kidd</em></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*7D0JDPQPeQjdLvzSr4ptwQ.jpeg" /></figure><p><strong>Detox de Datos Latinx is a social media campaign and project that aims to provide a platform for young people in Latin America and the Caribbean (LAC) to educate their peers about data and technology. Led by </strong><a href="https://jaaklac.org/"><strong>JAAKLAC Iniciativa</strong></a><strong>, youth representatives from </strong><a href="https://jaaklac.org/blog/causas-digitales-2020/"><strong>Causas Digitales</strong></a><strong>, and in collaboration with young designers, schools, youth groups and other organisations in the region, this campaign reconceptualised Tactical Tech’s </strong><a href="https://datadetoxkit.org/en/families/datadetox-x-youth/"><strong>Data Detox x Youth</strong></a><strong> into a social media campaign for youth in the region.</strong></p><p>Through weekly meetings and workshops, the campaign team and the creative team worked together to think about how the Data Detox x Youth — an interactive workbook to educate young people of high school age about Online Privacy, Digital Security, Digital Wellbeing and Misinformation — could be adapted for young people in the region, asking questions such as: <em>How can digital formats be used as a way to reach young people online?</em> <em>How can a social media campaign be culturally relevant? And how can these topics reach the broader Latinx community?</em></p><h3><strong>Developing the campaign</strong></h3><p>The group began by selecting the main tips from the Data Detox x Youth privacy section and adapting the format to be shared as an Instagram carousel. 
In the second version, the group discussed the possibility of making the visuals and language more appealing to the target audience, with brighter colours and common expressions from Latin American Spanish.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*sgbXu__HCZQrqRuqsZTUcQ.png" /><figcaption>The first creative idea for the campaign</figcaption></figure><p>The character illustrations that were used in this second version sparked interest in the young participants, who then decided to develop the idea of characterisation further by coming up with their own characters and stories, answering questions such as: <em>what is your character’s background and interests, and how do they relate to the Data Detox?</em></p><p>The resulting characters for Detox de Datos Latinx were Matias, Anto, Berta and Iktan. These characters were based on the young participants’ own experiences and interests, and through the characters they were able to put some of themselves into the campaign.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*KAhNQCkVfW-BOKET7dXLyw.png" /></figure><p>One of the young representatives from Causas Digitales, David Aragort, came up with the character Anto, a Venezuelan girl who wants to improve the social and political reality in her country.</p><blockquote>“With my character I tried to make visible different situations of injustice that are experienced in my country, but that are also common in other countries in the region, such as homophobia, repression and persecution for political reasons carried out through state security agencies. 
In addition, I wanted to create a character that captures the essence of the youth of my country, which aims to transform our social reality by getting involved in public affairs and promoting inclusion and respect for those who think differently.”</blockquote><p>A young participant from Mexico, Azeneth Guadalupe García Méndez, helped to develop the character of Iktan, who teaches people misinformation martial arts. Iktan is a Mexican boy who wants to be a scientist but worries about the amount of fake news circulating on the internet.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*p6zQtvSNtcfQruZ6WqYP1g.jpeg" /></figure><blockquote>“Developing Iktan was a very fun process that took us through different paths, but something that was always clear was that we wanted to show Mexican culture. Iktan is the result of the context of those of us involved in his creation, because like him, we also enjoy hanging out with our friends and sharing part of our lives on social networks. He not only represents the Latin American youth, but also an aspiring scientific community, which seeks to combat misinformation today, because with the crises we are experiencing and among the seas of information and data on the Internet, it is easier to fall into misinformation.”</blockquote><p>The junior designer Agustina Nunes, from WILD FI, worked with these characters to come up with the social media campaign materials, which were launched mid-April, with each character taking a different day of the campaign and focusing on a different topic from the Data Detox x Youth: <a href="https://jaaklac.org/es/detoxlatinx_matias/">Digital Privacy</a>, <a href="https://jaaklac.org/es/detoxlatinx_anto">Digital Security</a>, <a href="https://jaaklac.org/es/detoxlatinx_berta/">Digital Wellbeing</a> and <a href="https://jaaklac.org/es/detoxlatinx_iktan/">Digital Misinformation</a>. 
The messages encouraged the audience to stay tuned for each character’s stories and recommendations and to share their own tips.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*TMsbszD_U1vT-XPxepkNAg.jpeg" /></figure><p>During the campaign, the youth group who developed the characters administered the JAAKLAC and Causas Digitales social media accounts. This was aimed at enabling different formats of participation through the project and including a diversity of voices in the campaign’s communications. Although the text publications were edited for coherence, these were mostly authored by the participating youth. It was agreed that the Spanish used should reflect each character’s nationality.</p><p>On the day dedicated to digital security and the character of Anto, David decided to have her <a href="https://twitter.com/_EmpowerRangers/status/1382364744699555848">interact with the Empower Rangers</a>, a group of characters created by Redes Ayuda. This allowed them to communicate about human rights and digital security issues in a humorous tone, more appealing to an audience that is not necessarily familiar with or interested in these issues. Both characters interacted through the JAAKLAC Twitter account.</p><p><em>“I was very interested (in the project) because I realized that in my environment we almost don’t have a culture of privacy and security in the digital environment, that’s why this project was important to me; besides it would give me the opportunity to learn and share ideas with different people from Latin America, joining efforts to show some recommendations that can make the internet a safer and healthier place for our interaction.”</em> (Azeneth Guadalupe García Méndez — Mexico)</p><h3>What’s next?</h3><p>The overall feedback about the project’s activities and campaign was highly positive. Participants valued the “good vibe environment”, “fluid conversations” and “collaborations amongst everyone”. 
In terms of the campaign itself, the use of characters was identified as a highlight. As participants argued, this approach allowed “people to identify with them (the characters) by having very Latin American stories” while connecting these to the Data Detox Kit.</p><p>The group agreed there are opportunities to develop the current storylines and create new characters. They hope to refine the characteristics of specific segments (LGBTQ+, environmentalists, scientists) and develop new ones, such as for the BLM movement and indigenous rights activists. For this purpose, special workshops could be designed for youth and organisations to create Data Detox characters, tailoring the stories, language and recommendations to their own communities/audiences.</p><blockquote>“I think a next step in the project could be to aim for more audiovisual content with which people can interact more actively. For example, creating an app in which people can create their own character, share their story and also learn about the stories of others.”</blockquote><p>(David Aragort — Venezuela)</p><p>If you are interested in participating in the project’s future activities, please send us a message at <a href="mailto:jallalla@jaaklac.org">jallalla@jaaklac.org</a></p><p>— — -</p><p><a href="https://jaaklac.org/">JAAKLAC</a> is an initiative that aims to increase critical digital education among Latin American and Caribbean communities, especially among young people in these regions. For this project JAAKLAC coordinated with Azeneth and Luatany from Causas Digitales. 
Together they convened the campaign’s “creation team”, consisting of other young people, CSOs and the creative agency WILD FI, whose representatives participated in weekly sessions between March and April.</p><ul><li><strong>Creation</strong>: youth group composed of Agustina Nunes (WILD FI), David Aragort, Eneyda Luatany Reyes Hernández, Patrick Yurandi Martínez, Azeneth Guadalupe García Méndez and Mónica Moran Padilla; and the organisations <a href="https://conexioneducativa.org/site/">Conexión Educativa</a>, <a href="https://www.instagram.com/institutoesefossevoce/">Instituto E se Fosse Você?</a> and <a href="http://paraguayeduca.org/es/">Paraguay Educa</a>.</li><li><strong>Workshops</strong>: the schools Unidad Educativa Eugenio Espejo and Unidad Educativa Buenaventura from Ecuador and the CSO <a href="https://twitter.com/GizHonduras">Red de jóvenes de la Biosfera Cacique Lempira</a> from Honduras.</li><li><strong>Dissemination</strong>: <a href="https://calyxinstitute.org/">Calyx Institute from the United States</a>, <a href="https://www.facebook.com/J%C3%B3venes-por-el-Cambio-JXC-Guatemala-1631514786869333/">Jóvenes por el Cambio — JXC Guatemala</a>, <a href="https://www.heforshesonora.org/">Megan Yaeli A. 
Monaño — HeForShe Sonora from Mexico</a>, <a href="https://www.instagram.com/redconcausa2030/">Red ConCausa de Latinoamérica</a>, <a href="https://redglobal.edu.uy/">Red Global de Aprendizajes de Uruguay</a>, <a href="https://linktr.ee/jovenesafropanama">Red de Jóvenes Afropanameños</a>, <a href="https://www.facebook.com/roddnaec/">Red de Organizaciones por la Defensa de Derechos de la Niñez y Adolescencia de Ecuador</a> and the <a href="https://linktr.ee/oajnu">Organización Argentina de Jóvenes para las Naciones Unidas (OAJNU)</a>.</li></ul><p>Read more about the campaign <a href="https://jaaklac.org/detoxlatinx/">here</a>.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=b26a8a8c59ce" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[The National Resistance Movement App and Digital Politics in Uganda]]></title>
            <link>https://medium.com/swlh/the-national-resistance-movement-app-and-digital-politics-in-uganda-558df35b7b48?source=rss-f41206e28794------2</link>
            <guid isPermaLink="false">https://medium.com/p/558df35b7b48</guid>
            <category><![CDATA[campaign]]></category>
            <category><![CDATA[data]]></category>
            <category><![CDATA[uganda]]></category>
            <category><![CDATA[apps]]></category>
            <category><![CDATA[politics]]></category>
            <dc:creator><![CDATA[Tactical Tech]]></dc:creator>
            <pubDate>Fri, 30 Apr 2021 15:00:29 GMT</pubDate>
            <atom:updated>2021-04-30T15:00:29.074Z</atom:updated>
            <content:encoded><![CDATA[<p><a href="https://medium.com/swlh/elections-theres-an-app-for-that-4ddc82e0f0c8"><em>Part 1: Elections — There’s An App for That</em></a><em><br></em><a href="https://medium.com/@Info_Activism/why-investigate-election-apps-e67b6adddf75"><em>Part 2: Why Investigate Election Apps?</em></a><em><br></em><a href="https://medium.com/@Info_Activism/campaign-apps-ghana-2020-37ce62544228"><em>Part 3: Campaign Apps Ghana 2020</em></a><em><br></em><strong><em>Part 4: The National Resistance Movement App and Digital Politics in Uganda</em></strong></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/640/1*LtYMvf1aEVeej9aGfe-3JQ.gif" /><figcaption>Screenshots from the Ugandan National Resistance Movement’s iOS app, explored in the text below, with added distortion. <a href="https://apps.apple.com/us/app/nrm-app/id1502670616">Source</a></figcaption></figure><p><strong>Information and Communications Technology in Uganda</strong></p><p>Days before Uganda’s January 2021 election, President Yoweri Kaguta Museveni posted an animated video of himself joining the #Jerusalemachallenge in response to reported requests from young voters. The challenge was a viral social media trend in which people from Buenos Aires to Odessa posted videos of themselves dancing to the South African hit song. The President’s video garnered nearly 20,000 reactions on <a href="https://www.facebook.com/KagutaMuseveni/videos/jerusalem-challenge-accepted/1373662822976666/">Facebook</a> with another 3,400 likes on <a href="https://twitter.com/KagutaMuseveni/status/1348571504297046019?s=20">Twitter</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*4WzscHzaK855sdSePwXCKw.png" /><figcaption>Ugandan President Yoweri Museveni participating in the #Jerusalemachallenge with this animation, a lighthearted social media clip that generated buzz among young voters. 
Many claim, however, that posts like this one reflect Museveni’s disregard for the political struggles in the East African country. <a href="https://www.facebook.com/KagutaMuseveni/videos/1373662822976666">Source</a></figcaption></figure><p>Critics assert that the politician’s playful video belies the grave reality of Uganda’s political situation. Uganda’s 2021 election campaign was the most violent in the country’s history. Two months prior to Museveni posting this clip, dozens of people were killed and over 500 were arrested when police detained Bobi Wine, a musician-turned-politician and Museveni’s most prominent presidential opponent, prompting a clash between Wine’s supporters and the police. Police claimed Wine’s campaign had violated COVID-19 measures; Wine’s team insisted the government was simply suppressing their campaign. Two months prior, the Ugandan military classified red berets — the signature token of Wine’s People Power movement — as military attire, banning them from the streets, from political gatherings, and from public life altogether.</p><blockquote>Over the years, Museveni’s National Resistance Movement party has used the new tools afforded by the internet to maintain its political power.</blockquote><p>Since its independence from British colonial rule in 1962, Uganda has never had a peaceful transfer of power. In the 1970s and 1980s, Museveni led uprisings against dictators before assuming office in 1986. Since that time, Uganda has amended its Constitution twice to allow Museveni to remain in power as he aged. Today, thirty-five years later, he’s one of the world’s longest-serving heads of state. 
With <a href="https://www.youtube.com/watch?v=dgFYYhVIZpc">two-thirds</a> of Ugandan voters under the age of thirty and <a href="https://themediaonline.co.za/2020/01/politics-and-fashion-the-red-beret-putting-the-brand-in-firebrand/">nearly 80%</a> under thirty-five, an entire generation of young Ugandans has lived with Museveni in power, and winning their support — even through frivolous videos — is crucial for anyone seeking office.</p><p>The East African country of 44 million has witnessed changes during Museveni’s presidency, but major problems persist. In the years following his rise, Museveni tackled the AIDS crisis across the country and led Uganda through a phase of economic recovery. Growth has slowed in recent years as government officials persecute critics and operate with impunity, establishing “the most influential <a href="https://www.justsecurity.org/74597/ugandas-museveni-secured-his-sixth-term-in-office-what-the-international-community-can-do-now/">authoritarian model</a> in the region.” In the run-up to the 2021 presidential election, law enforcement attacked journalists covering the race so brutally that several were <a href="https://cpj.org/2021/01/police-beat-detain-journalists-covering-opposition-candidates-ahead-of-uganda-elections/">hospitalized</a>, including one who allegedly sustained a fractured skull. American and European civic groups planning election monitoring efforts across the country <a href="https://www.reuters.com/article/uk-uganda-election/u-s-cancels-its-observation-of-ugandas-presidential-election-idUSKBN29I1AV">abandoned</a> their plans in response to intensified hostility from the state. 
Twenty-six election observers from the civil society organization Africa Elections Watch were <a href="https://africaelectionswatch.org/news/2021/01/15/uganda-uganda-authorities-must-release-26-civil-society-members-arrested-for-observing-elections/">arrested</a> while overseeing the contest.</p><p>Over the years, Museveni’s National Resistance Movement party has used the new tools afforded by the internet to maintain its political power. After the arrival of the Pakistan-based Warid Telekom in Uganda in 2008, mobile phone communications grew dramatically. In December 2009, 9.38 million Ugandans accessed the internet via mobile devices. Just twenty-four months later, the number had nearly doubled. From 2010 to 2014, mobile broadband subscriptions grew 70% year-on-year. With an increasingly connected electorate at home, Museveni gained the means by which to monitor and reach voters. Activists and critics assert that Museveni’s government has exploited the expansion of the telecommunications industry to surveil mobile networks — and intercept voters’ communications and track their movements — under the justification of national security as authorized by a collection of laws: the Anti-Terrorism Act (2002), the Regulation of Interception of Communication Act (2010), the Computer Misuse Act (2011), the Electronic Signatures Act (2011), the Electronic Transactions Act (2011), the Anti-Pornography Act (2014), the Communications Act (2013, amended in 2017), and the Data Protection and Privacy Act (2019). These legislative interventions are considered a regression of Ugandans’ digital rights.</p><blockquote>With an increasingly connected electorate at home, Museveni gained the means by which to monitor and reach voters.</blockquote><p>Before the <strong>February 2011 presidential election</strong>, Ugandan voters received pre-recorded robo-calls from the Museveni campaign reminding them to vote for him. 
Days before the election, the Uganda Communications Commission (UCC) furnished SMS providers with a list of keywords and phrases believed to instigate unrest. Texts containing these terms were to be blocked, which the UCC justified in the name of election integrity. Two months after the election, opposition presidential candidates who lost to Museveni spearheaded the ‘Walk to Work’ protest to call attention to the escalating cost of living resulting from inflation and fuel costs that had risen <a href="http://content.time.com/time/world/article/0,8599,2067136,00.html">50%</a> in four months. At the height of the protest, the UCC shut down access to social media across the country for 24 hours, undermining the demonstration’s mobilization efforts.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1000/0*fNd99V6PXGTD85M8.png" /><figcaption>The Ugandan Communications Commission censored SMSs containing the terms pictured above in the days preceding the 2011 general elections, contending that they posed a threat to election security. The banned terms — which span English and Nyole, a language native to eastern Uganda — include “people power”, “teargas”, “police”, and “<em>emundu</em>” (gun).</figcaption></figure><p>After the 2011 elections and before the 2016 elections, Ugandan authorities — acting on orders issued by Museveni himself — started using <strong>intrusion malware to surveil people suspected of supporting Museveni’s opposition, including civil society actors and journalists</strong>. The central malware of the operation enabled government agents to access passwords, files, microphones and cameras of targets without their knowledge, and in some cases, people with connections to the government’s targets were bribed to infect targets’ devices. Some hotels in Uganda cooperated in the scheme, allowing agents to install fake WiFi access points at places where formal business negotiations and official meetings between heads of state were held. 
As the NGO Privacy International <a href="https://privacyinternational.org/report/1019/god-and-my-president-state-surveillance-uganda">notes</a> in its report published four months before the 2016 elections, “The Ugandan Government is also currently in advanced stages of procuring a communications monitoring centre, five years after its Parliament passed the Regulation of Interception of Communications Act.” Civic engagement in the 2016 elections was higher than usual because the presidential race was the first to include a televised presidential debate. As the <strong>2016 elections</strong> neared, Museveni sent unsolicited mass texts to subscribers of Airtel Uganda, a telecoms company, via a third-party service. Recipients wishing to unsubscribe had to <a href="https://www.unwantedwitness.org/uproar-over-unsolicited-museveni-campaign-sms/">pay</a> UGX 220 (0.05 EUR) to do so. On the day of the election, Museveni instituted a ban on social media on mobile devices, <a href="https://www.cnn.com/2016/02/18/world/uganda-election-social-media-shutdown">calling</a> the change a “security measure to avert lies … [lies] intended to incite violence and illegal declaration of election results.” Critics maintain that the clampdown prevented voters from documenting and sharing instances of voting irregularities. Museveni won the election with a reported 60% of the vote.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1000/0*tA6q__eXB9HVs0Dy.png" /><figcaption>A screenshot of an unsolicited message from President Museveni to a voter in the days before the 2016 election. Source: <a href="https://observer.ug/news-headlines/39174-uproar-over-unsolicited-museveni-campaign-sms">The Observer</a></figcaption></figure><p>Years later, the pattern of meddling with digital technology for political gain has continued. In 2018, the Ugandan government — partly, it seems, out of recognition of the democratizing force of social media — instituted a social media tax. 
The tax imposed a UGX 220 (0.05 EUR) cost per day on a bundle of 60 apps, including Facebook, Twitter, WhatsApp, Instagram, Skype, and Yahoo Messenger. It amounted to about 20% of what Ugandans pay for their phone plans, leading many price-sensitive Ugandans to cease their internet usage altogether. At the time, Museveni <a href="https://www.news.uct.ac.za/article/-2019-12-18-politics-and-fashion-the-rise-of-the-red-beret">wrote</a> on his blog, “Social media use is definitely a luxury item…Internet use can sometimes be used for education purposes and research. This should not be taxed. However, using internet to access social media for chatting, recreation, malice, subversion, inciting murder, is definitely a luxury.”</p><p>Three days before the most <strong>recent election on January 14, 2021</strong>, the Atlantic Council’s Digital Forensic Research Lab <a href="https://medium.com/dfrlab/social-media-disinformation-campaign-targets-ugandan-presidential-election-b259dbbb1aa8">revealed</a> <strong>networks of fake, pro-government accounts on Facebook and Twitter apparently linked to Uganda’s Ministry of Information and Communications Technology</strong>. Both companies removed these accounts, prompting pro-government actors online to campaign with the hashtag #StopTechcolonisation. Museveni claimed that Facebook was <a href="https://www.nytimes.com/2021/01/13/world/africa/uganda-facebook-ban-elections.html">taking sides</a>, and that it would not be permitted to operate in Uganda. This time, the internet shutdown lasted <a href="https://www.reuters.com/article/us-uganda-internet-rights-trfn-idUSKBN29P1V8">100 hours</a>, costing Ugandans — who rely heavily on mobile apps to transact with one another — an estimated <a href="https://www.reuters.com/article/us-uganda-internet-rights-trfn-idUSKBN29P1V8">9 billion USD</a>. 
The shutdown also hurt opposition groups, who increasingly resorted to social media as government crackdowns on traditional media and journalists undermined press freedom. Museveni was ultimately declared the election’s winner.</p><p><strong>National Resistance Movement App</strong></p><p>One month before the 2021 election, Museveni released a YouTube video entitled “ICT sector #Uganda.” In it, Museveni’s voiceover proclaims over stirring background music, “The new technology should help all people produce products and services according to the principles of comparative advantage.” On-screen text boasts the extent of e-government services before closing with a bold vision about the promise of the National Resistance Movement’s (NRM) e-services.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1000/0*l0dWs97UXxvKUpNM.png" /><figcaption>A screenshot from a video Ugandan president Yoweri Museveni uploaded to YouTube reveals his party’s ambitions for e-services across the country. The video also stated that “Over 106 e-services can be accessed through the e-services portal.” Source: <a href="https://www.youtube.com/watch?v=OGQgUhxadcs">YouTube</a></figcaption></figure><p>Given the recent elections in Uganda, Museveni’s past handling of digital technologies, and the promises of his NRM campaign video, <strong>Tactical Tech’s Data &amp; Politics teamed up with The App Analyst to explore how Museveni’s National Resistance Movement (NRM) approached the personal data of Ugandans in practice</strong>. Campaign and party apps around the world are growing in popularity but remain largely unregulated and unchecked. Understanding how political groups’ apps work in practice allows us to spot differences between politicians’ grand visions and the realities of what their technologies do, as researchers and activists have done in India, the US, the UK, and beyond. 
What might an analysis of the NRM’s official app reveal?</p><p>NRM’s <a href="https://play.google.com/store/apps/details?id=com.afrosoft.nrm&amp;hl=en_US&amp;gl=US">Android app</a> has over 1,000 installs and, as of the time of writing, was last updated in February 2021, more than one month after the elections took place. On Android, the app requests permission to access users’ precise GPS coordinates, photos, media files, camera (to take photos or videos), and microphone (to record audio). The app has overwhelmingly positive reviews, with the vast majority of reviewers rating it five stars.</p><p><strong>The </strong><a href="https://apps.apple.com/us/app/nrm-app/id1502670616"><strong>iOS app</strong></a>, on the other hand, was of particular interest to us because — although Android usage in Uganda eclipses iOS usage — the app’s developer, Jaguza Tech Uganda Limited, <strong>provided no privacy policy</strong> for the app. (A privacy policy is a document explaining how a company uses its users’ data. Without it, users have no way of knowing, for instance, whether their data is being shared with or used by third parties.) The Apple App Store states that developers must submit a privacy policy when they next update the app, which was most recently updated in November 2020. Of the eight apps Jaguza Tech has released on the iOS App Store, only three have any privacy disclosures. These eight apps include a plant health monitoring app that uses deep neural networks to classify images of leaves to detect disease and georeferences images to warn farmers of outbreaks, an animal tracking app for farmers, a gospel church app enabling users to livestream sermons, a tractor booking app, a tourism app, a tick management app, and Uganda’s Electoral Commission app.</p><p>We also grew interested in the app because its functionality was quite limited. 
As the app’s creators <a href="https://apps.apple.com/us/app/nrm-app/id1502670616#?platform=iphone">mention</a> on the App Store, “The NRM APP provides latest news updates from NRM, Events, Achivements [sic] and public communication.” In other words, the party uses the app to send supporters updates. Our team has <a href="https://ourdataourselves.tacticaltech.org/posts/campaign-apps/">written about</a> how campaign apps from the Dominican Republic, India, the United States, and beyond have jeopardized voters’ data. Many of these apps contained functionality more complex than that of the NRM app. Could the NRM app’s simplicity foster a secure user experience, or is the absence of a privacy policy a sign of bigger problems?</p><p>By downloading the app like any other user, we could interact with it as described in the App Store. We monitored traffic passing to and from the app using Charles Proxy, and many of the usual third-party suspects appeared: Facebook (to connect to Facebook’s social graph), Google (for in-app messaging), and Imgur, an online photo-sharing community from which several of the app’s images were sourced. The folder containing app files gave us access to a plethora of images: 55 campaign posters of NRM candidates running for parliament, a few dozen images for news pieces, and a few videos. There was nothing untoward about these findings.</p><p>However, we <strong>also found another set of images in a directory called “UserImages.”</strong> The folder contained two distinct photos of the man we believe to be the app developer, perhaps uploaded to the app as a test. One of the images shows what appears to be the individual’s badge of membership for NRM SOMA, shorthand for the National Resistance Movement’s social media activists. 
The badge includes what seems to be an image of his face, his full name, ID number, date of issuance and expiry (five years apart, the length of time between national elections), and a physical return address located in central Uganda in case of loss.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/634/0*mewontR0OU_4OjSO.png" /><figcaption>A screenshot of Yoweri Museveni’s National Resistance Movement’s iOS app from Apple’s App Store highlights a “Zero Tolerance To Corruption” achievement. In 2020, the NGO Transparency International ranked Uganda 142nd out of 180 on perceptions of public sector corruption with a score of 27/100. <br> Sources: <a href="https://apps.apple.com/us/app/nrm-app/id1502670616#?platform=iphone">iOS App Store</a>, <a href="https://www.transparency.org/en/cpi/2020/index/can">Transparency International</a></figcaption></figure><p>By viewing these two images via Charles Proxy, we found that they had been uploaded to a URL belonging to Bluehost, an American web hosting company. From this URL, <strong>we were able to freely access a site listing all the images users themselves had uploaded to the iOS app</strong>. There were over 230 image files — predominantly selfies of young to middle-aged men, occasionally with their children and one with his German shepherd. The uploaded images were taken starting in July 2020, and new images were uploaded as recently as a few days prior to the time of writing. For some photos, the device manufacturer and model of the phone were accessible. Notably, the images were different sizes, indicating lax security practices that failed to standardize or sanitize the uploaded images.</p><p>Because the images had not been sanitized, several still contained their metadata. 
In fact, <strong>for a handful of images, we were able to access the geolocation information stored in their metadata.</strong> This information contained the <strong>GPS coordinates pinpointing the precise spot at which the photograph in question was taken</strong>. All of the geolocated photos mapped to spots in Uganda except for one taken in Sharjah, United Arab Emirates, on February 26, about six weeks after the election.</p><p>The NRM app failed to secure its supporters’ photos, which unscrupulous actors or foreign powers could have used however they chose. This point serves as a reminder that data subjects — to use the language of the European Union’s General Data Protection Regulation — effectively relinquish control of their personal data not only when it lands in the hands of adversarial data controllers, but even when sharing data with causes they support. When shown our findings, one Ugandan stakeholder who requested anonymity remarked, “I am not surprised by the results because the NRM is widely known as a digital authoritarian government that would rather compromise the security of its supporters at the expense of retaining the presidency.”</p><blockquote>“I am not surprised by the results because the NRM is widely known as a digital authoritarian government that would rather compromise the security of its supporters at the expense of retaining the presidency.”</blockquote><p>In an attempt to fix the issues we identified, we emailed the app’s developers repeatedly over the course of months, but we received no response. In the aftermath of a recent global Facebook incident in which the names, emails, and other identifiers of 533 million users were compromised, one expert asserted that the same barriers that prevent users elsewhere from valuing their personal data also exist in Uganda. Unless people understand the financial value of their data, they suggested, users will continue to undervalue their personal data. 
And helping people appreciate how much their personal data is worth requires continued investments in digital literacy.</p><p><strong>Personal Data &amp; Politics Beyond 2021 in Uganda</strong></p><p>This glimpse at the NRM app’s handling of personal data sets a poor precedent for the “e-education, e-security, e-government, e-health, e-extension” and other internet-based solutions Museveni’s government evidently intends to deploy, especially given the possibility that each of these services may not benefit from the same abundance of resources that the National Resistance Movement invested in its own app. If the Ugandan government proceeds to implement apps for “e-education, e-security, and e-governance,” as Museveni’s campaign video suggests, what could happen to data belonging to Ugandan citizens that these apps collect? Perhaps a better question is what might happen if a rival political movement in Uganda, unable to amass power through traditional means as a result of government crackdowns, resorts to technological solutions to organize but fails to secure its users’ data properly. Given the NRM’s treatment of election monitors and journalists, how might the Ugandan government at Museveni’s behest use political opponents’ selfies or geolocation data? The mere prospect paints a chilling picture. As one specialist observed, “The NRM regime has explored every opportunity at its disposal to conduct surveillance including training security personnel abroad.”</p><p>Recent technological investments are creating new possibilities to wield technology for influence, particularly with a leader like Museveni in power. 
For example, <a href="https://business.eskimi.com/about-us/">Eskimi</a>, an advertising platform that arrived in Uganda in 2017, boasts a suite of “Data-Driven Ads” <a href="https://business.eskimi.com/demand-side-platform/">products</a>, enabling marketers “to utilize an increasingly broad set of near-real-time information to place relevant ads in front of relevant people.” Polling companies like <a href="https://www.researchworldint.net/about-rwi/">Research World International</a>, <a href="https://twitter.com/ipsosuganda?lang=en">IPSOS</a>, <a href="https://afrobarometer.org/countries/uganda-0">AfroBarometer</a>, and <a href="https://www.geopoll.com/market-research-uganda-panel/">GeoPoll</a> have sprung up over the past fifteen years and evolved into a cottage industry to seize the market opportunity. One commentator requesting anonymity noted that app and website developers, including those who built the NRM app, are eager to “target quick and easy NRM cash” and money from other election-related organizations. Even without any unscrupulous actions on the part of the pollsters, opportunistic political leaders can exploit the insights afforded by mass polling and by the personalized data collection powering the ad tech ecosystem, deploying highly personalized ads and more <strong>invasive digital listening campaigns</strong> than those launched in 2016. Others could even wage <a href="https://www.atlanticcouncil.org/in-depth-research-reports/operation-carthage-how-a-tunisian-company-conducted-influence-operations-in-african-presidential-elections/"><strong>disinformation-for-profit</strong></a><strong> campaigns</strong>, marrying the highly personalized capabilities of advertising with propaganda observed elsewhere on the continent.</p><p>Some of these changes are already underway. 
In recent years, Uganda’s capital, Kampala, rolled out <a href="https://www.wsj.com/articles/huawei-technicians-helped-african-governments-spy-on-political-opponents-11565793017">Huawei</a>’s smart city solutions and installed facial recognition cameras throughout the city (as Huawei has done across swaths of the African continent). When asked what the international community should know about the digital-political state of affairs in Uganda, a local activist noted, <strong>“Technology is a new weapon being used to suppress political opposition, exert power and control by dictatorial regimes like Uganda. No government in the history of Uganda has ever invested so much public resources in surveillance like the NRM, overlooking vital sectors like agriculture, employment, education and health.”</strong> With several Ugandan government offices — including the Directorate of Immigration and the Ugandan Revenue Authority — inclined to integrate national IDs and facial recognition, the risk of digital technologies abetting authoritarian, undemocratic ends, even through seemingly innocuous political apps, will grow. And as the government amasses more data and digital assets on its supporters and its critics, the National Resistance Movement will be even better positioned to pursue and justify a continued practice of <em>digitalpolitik</em>.</p><p>— — —</p><p><a href="https://medium.com/swlh/elections-theres-an-app-for-that-4ddc82e0f0c8"><em>Part 1: Elections — There’s An App for That</em></a><em><br></em><a href="https://medium.com/@Info_Activism/why-investigate-election-apps-e67b6adddf75"><em>Part 2: Why Investigate Election Apps?</em></a><em><br></em><a href="https://medium.com/@Info_Activism/campaign-apps-ghana-2020-37ce62544228"><em>Part 3: Campaign Apps Ghana 2020</em></a><em><br></em><strong><em>Part 4: The National Resistance Movement App and Digital Politics in Uganda</em></strong></p><p><em>Varoon Bashyakarla is a data scientist at Tactical Tech. 
His work explores the datafication of politics.</em></p><p><em>Gary Wright is a researcher at Tactical Tech, examining the uses of digital technologies in politics and their impacts on society.</em></p><p><em>The App Analyst is a digital security researcher with a specialty in auditing mobile apps for privacy and security vulnerabilities. Follow The App Analyst’s work </em><a href="https://theappanalyst.com/"><em>here</em></a><em> and </em><a href="https://twitter.com/theappanalyst1"><em>here</em></a><em>.</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=558df35b7b48" width="1" height="1" alt=""><hr><p><a href="https://medium.com/swlh/the-national-resistance-movement-app-and-digital-politics-in-uganda-558df35b7b48">The National Resistance Movement App and Digital Politics in Uganda</a> was originally published in <a href="https://medium.com/swlh">The Startup</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
    </channel>
</rss>