This project was inspired by a real experience I witnessed during an online examination. A close friend of mine, Das, had to take an important exam over a weak and unstable internet connection. Because the proctoring system required continuous video streaming and strict monitoring, he faced repeated interruptions and warnings despite attempting the exam honestly. The system treated connectivity issues as suspicious behavior. This not only affected his performance but also caused unnecessary anxiety. Watching this made me realize that many online exam platforms are not designed for low-bandwidth or real-world conditions, especially for students from rural areas or limited network environments. At the same time, these systems rely heavily on automated judgments, often without room for human understanding. This raised an important question for me: why should students be penalized for technical limitations they cannot control? This experience motivated me to build an exam monitoring system that works reliably even on low-bandwidth networks and supports fairness through transparency and human decision-making, rather than automated accusations.
The system is an AI-assisted online exam monitoring platform designed to help institutions conduct fair and reliable remote examinations, even in low-bandwidth conditions. Instead of trying to automatically detect or accuse cheating, the system focuses on enforcing exam rules, recording factual events, and supporting teachers with transparent evidence.

🔹 Core Functions

- **Secure Exam Access:** Teachers can create exams and share a secure exam link with students. Students must authenticate themselves before entering the exam.
- **Identity Verification:** Face recognition is used at the start of the exam and when a student re-enters after an interruption, ensuring the same candidate continues the test.
- **Strict Exam Environment Control:** If a student switches tabs, minimizes the exam app, or leaves the exam environment, the test session is immediately paused or terminated according to the exam rules. The student must re-authenticate to continue.
- **Behavioral Event Logging (Not Judgment):** The system logs objective events such as app exits or tab switches, re-login attempts, network interruptions, and device-level changes. These events are recorded as facts, not interpreted as cheating.
- **AI-Assisted Pattern Support:** Keystroke dynamics and usage patterns are analyzed only to provide supporting context if needed. These signals never result in automatic penalties.
- **Low-Bandwidth & Offline Support:** The system operates with minimal data usage and can function during temporary network failures by securely buffering events locally and syncing them later.
- **Teacher-Centric Review Dashboard:** Teachers receive a clear timeline of exam events, allowing them to make informed and fair decisions after the exam.
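The local event buffer behind the offline support can be sketched in a few lines. This is a minimal illustration only, not the app's actual code: the table schema and the `log_event` / `sync_pending` names are hypothetical, and a real client would write to an on-disk database and encrypt the buffered rows, as described above.

```python
import json
import sqlite3
import time

# Local event buffer: events are written here first and synced to the server later.
conn = sqlite3.connect(":memory:")  # a real client would use an on-disk file
conn.execute(
    "CREATE TABLE events (id INTEGER PRIMARY KEY, ts REAL, payload TEXT, synced INTEGER DEFAULT 0)"
)

def log_event(kind: str, detail: str = "") -> None:
    """Record a factual exam event locally, regardless of connectivity."""
    payload = json.dumps({"kind": kind, "detail": detail})
    conn.execute("INSERT INTO events (ts, payload) VALUES (?, ?)", (time.time(), payload))
    conn.commit()

def sync_pending(upload) -> int:
    """Try to push unsynced events to the server; keep them locally on failure."""
    rows = conn.execute("SELECT id, payload FROM events WHERE synced = 0").fetchall()
    sent = 0
    for row_id, payload in rows:
        try:
            upload(payload)  # e.g. an HTTPS POST in the real system
        except OSError:
            break  # network still down; retry on the next sync attempt
        conn.execute("UPDATE events SET synced = 1 WHERE id = ?", (row_id,))
        sent += 1
    conn.commit()
    return sent

# Events logged while offline...
log_event("tab_switch")
log_event("network_interruption", "ping timeout")
# ...are delivered once the network stabilizes.
sync_pending(lambda payload: None)
```

Because events are facts rather than verdicts, losing connectivity never changes what is recorded, only when it reaches the teacher's timeline.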
Faced App was built with a strong focus on real-world constraints: low-bandwidth networks, limited devices, and the need for fairness in high-stakes exams. Instead of relying on heavy video streaming or black-box AI decisions, we designed the system around event-based monitoring and assistive intelligence. The architecture is modular:

- A lightweight exam interface built with HTML, CSS, and JavaScript enforces exam rules such as preventing tab switching or app exits.
- A Python-based backend (Flask/FastAPI) handles exam creation, authentication, event logging, and teacher dashboards.
- AI components such as face recognition and keystroke dynamics are used selectively, for identity verification and pattern support rather than automated judgment.

To handle unreliable connectivity, the system follows an offline-first approach, buffering encrypted events locally and syncing them once the network stabilizes. Every design choice prioritized reliability, transparency, and deployability over complexity.
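As a concrete example of the keystroke-dynamics signal mentioned above, such systems typically reduce raw key events to timing features like dwell time (how long a key is held) and flight time (the gap between releasing one key and pressing the next). The sketch below is illustrative only: the event format and the `timing_features` helper are hypothetical, and in the real system a scikit-learn model would consume features like these as supporting context, never as grounds for an automatic penalty.

```python
from statistics import mean

def timing_features(events):
    """Reduce (key, press_ts, release_ts) tuples to simple timing features.

    Returns mean dwell time (key held down) and mean flight time (gap
    between releasing one key and pressing the next), in seconds.
    """
    dwell = [release - press for _, press, release in events]
    flight = [
        events[i + 1][1] - events[i][2]  # next press minus current release
        for i in range(len(events) - 1)
    ]
    return {
        "mean_dwell": mean(dwell),
        "mean_flight": mean(flight) if flight else 0.0,
    }

# A short burst of typing: (key, press time, release time)
sample = [
    ("e", 0.00, 0.08),
    ("x", 0.20, 0.29),
    ("a", 0.41, 0.50),
    ("m", 0.62, 0.70),
]
features = timing_features(sample)  # consistent rhythm across the burst
```

A per-student baseline of such features can give a teacher context ("typing rhythm changed sharply mid-exam") without the system ever rendering a verdict itself.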
- **Low-bandwidth limitations:** Many proctoring systems assume stable internet. Designing features that work without continuous connectivity required rethinking how and when data is collected and transmitted.
- **Avoiding false accusations:** It was challenging to ensure that AI signals did not turn into automated judgments. We had to clearly separate event logging from decision-making.
- **Balancing control and fairness:** Enforcing strict rules like tab-switch termination while still allowing students to rejoin the exam safely required careful rule design.
- **Resisting feature overload:** It was tempting to add advanced AI features like gaze tracking or emotion detection, but we deliberately avoided them to keep the system ethical and explainable.
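The "strict but fair" rule design discussed above can be sketched as a small session state machine: a violation pauses the session rather than ending the exam outright, and a successful re-authentication lets the student resume. The class name, the states, and the violation threshold are all illustrative assumptions, not the app's actual policy.

```python
# States a session can be in. A pause is recoverable; termination is not.
ACTIVE, PAUSED, TERMINATED = "active", "paused", "terminated"

class ExamSession:
    """Hypothetical sketch of pause/re-auth rule enforcement."""

    def __init__(self, max_violations: int = 2):
        self.state = ACTIVE
        self.violations = 0
        self.max_violations = max_violations
        self.log = []  # factual event timeline later shown to the teacher

    def on_violation(self, kind: str) -> None:
        """Tab switch / app exit: pause, or terminate past the limit."""
        self.violations += 1
        self.log.append(kind)
        if self.violations > self.max_violations:
            self.state = TERMINATED
        elif self.state == ACTIVE:
            self.state = PAUSED

    def on_reauth(self, face_verified: bool) -> None:
        """Resume only after identity is re-verified by face recognition."""
        self.log.append("reauth_ok" if face_verified else "reauth_failed")
        if self.state == PAUSED and face_verified:
            self.state = ACTIVE

# A student who drops out once can rejoin after re-verification:
session = ExamSession()
session.on_violation("tab_switch")   # session is now paused
session.on_reauth(face_verified=True)  # back to active
```

Keeping the policy in one explicit state machine makes the rules easy to explain to students and easy for institutions to audit, and the `log` list stays a plain record of facts either way.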
Accomplishments

- Built an AI-assisted exam integrity system that does not rely on continuous surveillance
- Designed a solution that works even in low-bandwidth and unstable network conditions
- Kept teachers in control by presenting clear, factual exam event timelines
- Implemented strict exam enforcement (like tab-switch termination) without automated punishment
- Created a privacy-first architecture with minimal data collection

What We Learned

- AI is most effective when it supports humans instead of replacing them
- Reliability matters more than flashy features in real-world systems
- Ethical decisions directly influence technical architecture
- Simpler systems are easier to explain, trust, and scale
- Designing for constraints leads to better engineering

What's Next for Faced App

- Expanding support for Android devices with stronger device-level controls
- Improving the teacher dashboard with clearer visual summaries of exam events
- Adding configurable exam rules for different institutions
- Conducting real pilot tests with students and educators
- Refining AI models using anonymized, consent-based data
- Exploring institutional integrations while maintaining privacy guarantees

The long-term goal is to make Faced App a trusted exam integrity assistant, not an automated surveillance tool.
Built With
- Python (Flask/FastAPI) backend with HTML/CSS/JavaScript frontend
- FaceNet/DeepFace for identity verification
- scikit-learn for keystroke analysis
- Rule-based behavioral monitoring
- SQLite/MongoDB with an offline-first design
- Deployed on the cloud with low-bandwidth support