Validate a peer draft

When a peer in your field submits a scenario, it lands in your validation queue. Your sign-off is the gate that lets the scenario move toward learners. Two field-expert sign-offs are required by default; one is allowed if your field has only one expert on the platform.
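The sign-off threshold described above can be sketched as a small rule. This is an illustration only, not the platform's implementation; the function names and parameters are assumptions:

```python
def required_signoffs(experts_in_field: int) -> int:
    """Two field-expert sign-offs by default; one if the field
    has only a single expert on the platform."""
    return 1 if experts_in_field <= 1 else 2

def draft_advances(signoffs: int, experts_in_field: int) -> bool:
    """A draft moves toward learners once it has enough sign-offs."""
    return signoffs >= required_signoffs(experts_in_field)
```

For example, in a field with three experts, one sign-off is not enough to advance a draft, but in a single-expert field it is.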

What you are signing off on

  • This situation could happen in our field. It’s plausible. A practitioner would recognise it.
  • The actor would behave that way. The role brief carries enough detail for the AI to play the actor in a way you would recognise.
  • The trigger would matter. What’s at stake is real, not invented to make the scenario feel hard.

What you are NOT signing off on

  • Copy polish. Typos and awkward phrasing are a comment, not a reject.
  • Trainer pedagogy. Whether this scenario fits a particular cohort right now is the trainer’s call at promotion, not yours.
  • The author’s choice of weights or competency vector. Those belong to the author. You can comment if a weight choice looks wrong, but the author makes the call.

How to validate

Open the validation queue from your home. Pick a peer’s draft. Read in this order:

  1. Trigger first — does this situation belong in our field?
  2. Actor briefs next — would each actor behave that way?
  3. Phases — does the sequence fit the situation?
  4. Success criteria — is “competent” recognisable from this?

The realism preview appears alongside the draft. Read its flags, but do not defer to them; you are the practitioner.

The three outcomes

  • Approve. The scenario is professionally true. Your sign-off is recorded.
  • Comment. Request changes. Use this when something needs work but is not a substantive accuracy issue — a thin actor brief, a trigger that needs more specificity, a confusing phase order.
  • Reject. Use this when there is a substantive issue with field accuracy — the situation could not happen, the actor would not behave that way, the stakes as written are wrong. A reject sends the draft back to the author with your reason.
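The choice between the three outcomes comes down to two questions: is the scenario accurate to the field, and does anything still need work? A minimal sketch of that decision, as a hypothetical rule of thumb rather than platform behaviour:

```python
def validation_outcome(accurate: bool, needs_work: bool) -> str:
    """Map a validator's two questions onto the three outcomes:
    a substantive field-accuracy problem is a reject; accurate
    but unfinished is a comment; otherwise approve."""
    if not accurate:
        return "reject"   # situation, actor behaviour, or stakes are wrong
    if needs_work:
        return "comment"  # e.g. thin actor brief, vague trigger
    return "approve"
```

Note the ordering: accuracy is checked first, so a draft that is both inaccurate and unpolished is a reject, never a comment.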

When something goes wrong

  • You’d word it differently but it’s accurate. That’s a comment, not a rejection. The author owns the words.
  • The situation could happen but is unusual. That’s fine. Learners need practice across the distribution of real situations, not just the median case.
  • You’re the only expert in the field. One sign-off advances the draft. The platform records this so trainers and admins know the scenario went through single-expert validation.
  • You disagree with another expert’s sign-off. Open a comment on the draft. Two-expert validation exists precisely to surface this kind of disagreement before learners see the scenario.