Teens in AI Albania: Hackathon Scoring App
A purpose-built hackathon platform - team formation, submissions, two-phase jury scoring, live rankings - that handled 200 participants and 36 teams in a single day.

Key results
- 200 Participants
- 36 Teams
- 12 Evaluators
Challenge
A hackathon with 200 participants, 36 teams, and a 5-person jury running across a single day is a logistics challenge that most off-the-shelf tools are not built for. Teams need to find each other, submit projects, and update work in real time. Jury members need to score every team against a structured set of criteria - across two evaluation phases - without chasing spreadsheets or waiting for an organizer to compile results. And at the end of the day, the rankings need to be live, defensible, and exportable.
Teens in AI Albania 2026 needed a purpose-built platform that could handle the full event lifecycle: from team formation to final results, in one place, on the day.
Solution
We built a dedicated hackathon tracking application on Replit covering every role in the event - participants, jury, and administrators.
Participants
Participants could search by name to find their team, submit project links across multiple formats (presentation, demo, repository, video), and update submissions as their work evolved during the day. The submission flow was intentionally simple - a participant with no technical background could complete it in under two minutes.
Jury members
Jury members had a dedicated evaluation interface showing all teams and their submissions. Scoring was structured across 10 weighted criteria spanning two phases:
- Phase 1: Innovation, Feasibility, Technical Execution, Social Impact, Teamwork
- Phase 2: Impact and Alignment with UN SDGs (weight 15%), Innovation and Creativity (10%), Technical Execution (25%), Ethics (5%)
Each criterion was scored out of 10 and weighted automatically, with final scores computed in real time.
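The weighting step described above can be sketched in a few lines. The criterion names and weights below follow the Phase 2 list; the data model and normalisation choice are illustrative assumptions, not the platform's actual code.

```python
# Sketch of the weighted-score computation: each criterion is scored out of 10,
# multiplied by its weight, and the total is normalised by the sum of weights
# so a perfect scoresheet always yields 10. (Assumed model, for illustration.)

PHASE2_CRITERIA = {
    "Impact and Alignment with UN SDGs": 0.15,
    "Innovation and Creativity": 0.10,
    "Technical Execution": 0.25,
    "Ethics": 0.05,
}

def weighted_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Combine per-criterion scores (0-10) into a single weighted total."""
    total_weight = sum(weights.values())
    raw = sum(scores[name] * w for name, w in weights.items())
    return round(raw / total_weight, 2)

example = {
    "Impact and Alignment with UN SDGs": 8,
    "Innovation and Creativity": 9,
    "Technical Execution": 7,
    "Ethics": 10,
}
final = weighted_score(example, PHASE2_CRITERIA)
```

Because the computation is a pure function of the stored scores, it can be rerun on every submission, which is what makes real-time recomputation cheap.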
Administrators
Administrators had full event phase control - opening and locking submissions, triggering Phase 1 and Phase 2 scoring, promoting the top 10 finalists, and publishing final results. Rankings were live, recomputable, and exportable to CSV at any point. The entire admin interface was built for a non-technical organizer to run without support.
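The "live, recomputable, and exportable" rankings amount to averaging each team's weighted totals across jury members, sorting, and writing a CSV. A minimal sketch, assuming an in-memory map of per-juror totals (team names and structure are hypothetical):

```python
# Hedged sketch of ranking recomputation and CSV export, as described above.
# The jury_totals structure (team -> list of per-juror weighted totals) is an
# assumed representation, not the platform's actual schema.

import csv
from statistics import mean

def rankings(jury_totals: dict[str, list[float]]) -> list[tuple[int, str, float]]:
    """Rank teams by the mean of their per-juror weighted totals."""
    averaged = {team: round(mean(totals), 2) for team, totals in jury_totals.items()}
    ordered = sorted(averaged.items(), key=lambda kv: kv[1], reverse=True)
    return [(rank, team, score) for rank, (team, score) in enumerate(ordered, start=1)]

def export_csv(rows: list[tuple[int, str, float]], path: str) -> None:
    """Write the current rankings to a CSV file for organizers."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["rank", "team", "score"])
        writer.writerows(rows)

rows = rankings({"Team Alpha": [8.2, 7.9], "Team Beta": [9.1, 8.8]})
```

Keeping the ranking a cheap, repeatable function over stored scores is what lets an organizer recompute and export at any point in the event without manual compilation.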
Outcome
- 200 participants managed across 36 teams in a single platform
- 35 of 36 teams scored across 10 weighted criteria in two evaluation phases
- Top 10 finalists identified and promoted automatically
- Full event lifecycle - submissions, scoring, rankings, export - completed in one day with zero manual coordination
- Live leaderboard available to organizers and jury throughout the event
Key Learnings
- Multi-role design is the hardest decision. Getting the boundary between participant, jury, and admin right at the start saves hours of confusion on the day. Retrofitting access controls mid-event is painful and erodes trust with participants.
- Real-time scoring changes the energy of an event. When jury members can see rankings update live and organizers can recompute at any moment, the result feels earned - not compiled overnight.
- Vibe coding made it possible. A tool of this complexity - multi-role, multi-phase, real-time scoring with weighted criteria - would have taken weeks to build conventionally. We built it ahead of the event with AI-assisted development, iterated based on organizer feedback, and ran it live on the day without issues.
- Events like this are where NEKOD's governance philosophy is most visible: you do not lock people out, you give them exactly what they need for their role, with the right controls in the right places.