From Insight to Impact


Testing Acceleration Services with the Agora Platform: A Design Thinking Journey

Testing a new acceleration service involves more than just a technical check: it’s part of a structured process grounded in Design Thinking principles. Following the seven key phases outlined in our methodology, testing plays a crucial role in ensuring the continuous improvement and evolution of acceleration services.

Rather than being a standalone step, testing is integrated throughout the Design Thinking process. It provides continuous feedback, making the process iterative and adaptive. Specifically, our testing strategy aims to validate the Minimum Viable Solution (MVS) of the Agora platform and to generate the insights needed to scale it up to a full Viable Solution (VS).

Why Different Types of Testing?

Testing is not “one size fits all.” Each type of test serves a specific purpose and addresses a different aspect of service development: services must not only function, but also prove their usefulness, ease of use, and capacity to generate value in the long run. For this reason, our testing activities operate on multiple layers:

  1. Initial Validation of Problem–Solution Fit
    Are the acceleration services solving the right problems for users?
  2. Market Validation
    Do the services align with broader market needs and user expectations?
  3. Usability and Platform Design Validation
    Can users efficiently navigate the Agora platform?
    • Usability directly impacts adoption and engagement.
    • We assess whether the platform is intuitive, accessible, and free of unnecessary obstacles.
  4. Assessment of Institutional Transformation
    How do acceleration services help higher education institutions (HEIs) implement their transformation agendas?

Who We Test With

A crucial element of the testing process is the active involvement of users. The Agora platform was conceived as a co-created space, so it is only natural that the co-creators of the services available on Agora, along with current and prospective users, are central to the testing phases. Participants include administrative and technical staff, academic staff, and students, drawn from both University Alliances and individual Higher Education Institutions.

By engaging such a diverse group, we can capture perspectives from those who design and manage services, those who deliver education and research, and those who experience these services as part of their professional or learning journey. Their feedback provides us with a multi-faceted view of how Agora performs and how we can improve it.

Why We Work in Cycles

Testing is organized into three distinct cycles. This design reflects the idea that innovation is iterative: each cycle generates insights that feed into the next, allowing us to track progress, refine solutions, and identify new challenges along the way.

The first cycle has involved members of the Unite! and EPiCUR University Alliances, helping us to validate the initial service offer and test its usability. The second cycle, which is just starting, brings in additional European Alliances already familiar with the Agora platform. Finally, the third cycle will not only introduce new users but also re-engage earlier participants, so we can test improvements and ensure the platform remains relevant and impactful.

This cycle-based approach does more than improve individual services: it creates a continuous feedback loop that strengthens the whole ecosystem. Regular updates and evaluations flow directly to the aUPaEU technical team responsible for evolving Agora from MVS to VS, ensuring that decisions are data-driven and closely aligned with stakeholder needs.

Early Results and User Feedback

Even at this early stage, user feedback has highlighted several key strengths of the Agora platform.

A recurring theme is efficiency. By automating tasks that were previously handled manually, Agora allows staff to save time and focus on higher-value activities. This improvement is particularly visible for co-creators and administrative staff, who have noticed smoother workflows and fewer operational bottlenecks.

Another important aspect is the centralization of resources. Instead of dealing with fragmented tools and scattered information, users now have a single point of access. This makes it easier to navigate services, discover opportunities, and connect with the right resources at the right time.

Equally appreciated is the interactive nature of the platform. Unlike static repositories, Agora is designed for engagement: users can connect directly, explore clickable catalogues, and move seamlessly from browsing to action.

Finally, Agora has been recognized as a facilitator for collaboration and connectivity. Researchers, in particular, see it as a gateway for building international networks, discovering peers with shared interests, and developing new projects across European Universities.

Looking Ahead

The first feedback loops confirm that Agora is not just a digital platform, but a springboard for transformation. It saves time, reduces fragmentation, and fosters collaboration in ways that directly benefit Higher Education Institutions and their communities.

As we continue testing, our goal is to refine these strengths and address emerging challenges. The journey is iterative and adaptive, but the direction is clear: empowering institutions and individuals to innovate, connect, and grow together.
