

A Behind-the-Scenes Look
by John Cunningham, W1AI, Coursemaster
When a new question pool is released, the months-long work of updating our course is just beginning.
When a new question pool arrives
The National Conference of Volunteer Examiner Coordinators (NCVEC) updates the amateur radio question pools on a rotating schedule. Each license class—Technician, General, and Amateur Extra—gets a new question pool every four years. A new Technician pool will go into effect on July 1. After that, the General pool will be updated in July 2027, and the Extra pool in July 2028. No question pool will be updated in 2029. The cycle starts over with the Technician again in 2030.
The new question pools are generally made public about six months before their effective date, giving course developers time to update their materials. At HamTestOnline™, this is our busiest time of year. We treat each new question pool as an opportunity to improve our courses.
The first step is getting those new questions into our system. We have software that imports the entire pool and automatically matches each new question with its closest counterpart from the previous pool. This year, 91% of the questions matched to some degree, giving us a solid starting point. That setup is completed in less than a week, but the real work begins after that.
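The matching step described above can be sketched in a few lines. This is a hypothetical illustration only, not HamTestOnline's actual software: it pairs each new question with its most similar predecessor using Python's standard-library `difflib` similarity ratio. The question IDs, texts, and the 0.6 threshold are all invented for the example.

```python
# Hypothetical sketch: match each new-pool question to its closest
# counterpart in the old pool using a text-similarity score.
# Question IDs and texts are invented sample data, not real pool content.
from difflib import SequenceMatcher

old_pool = {
    "T1A01": "Which agency regulates the Amateur Radio Service in the US?",
    "T5A01": "What is the unit of electrical current?",
}
new_pool = {
    "T1A01": "Which agency regulates and enforces the Amateur Radio Service in the US?",
    "T9Z99": "An entirely new question with no old counterpart at all?",
}

def best_match(question, candidates, threshold=0.6):
    """Return (old_id, score) for the most similar old question,
    or None if nothing scores above the threshold."""
    scored = [
        (old_id, SequenceMatcher(None, question, old_text).ratio())
        for old_id, old_text in candidates.items()
    ]
    old_id, score = max(scored, key=lambda pair: pair[1])
    return (old_id, score) if score >= threshold else None

for new_id, text in new_pool.items():
    match = best_match(text, old_pool)
    print(new_id, "->", match[0] if match else "no match")
```

A real system would likely normalize punctuation and compare answer choices as well, but the idea is the same: an automatic first pass that leaves only the unmatched questions for human attention.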
Over the following months, I go through the course, infotext by infotext, and question by question, updating and improving the material. In parallel, we hold a series of technical review meetings to evaluate the reworked sections. My reviewers repeat the same process I do—infotext by infotext and question by question—often finding additional ways to improve the course. As a result, the review becomes part of the rework process rather than a simple sign-off.
A logical learning order
One of the first things we look at is the order in which concepts are introduced. For example, I noticed that “solder” was referenced in sections on station and tower grounding before it had been adequately explained, so I moved the “Solder” infotext earlier in the course. Our goal is to make sure students don’t encounter unfamiliar terms without a clear explanation.

Using student performance data
One advantage of an online system is that we can collect detailed statistics on student performance. Our goal is for at least 90% of students to answer each question correctly the first time they see it after reading the associated infotext. When questions in the new pool are identical or similar to those in the previous pool, we look at how students performed on those older questions. When necessary, we revise the infotext to bring those results up to our standard.
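The kind of check described above can be illustrated with a short sketch. This is not HamTestOnline's actual analytics code; the attempt records and question IDs are made-up sample data, and only the 90% target comes from the text.

```python
# Illustrative sketch: compute each question's first-attempt correct rate
# and flag questions that fall below the 90% target mentioned above.
# The attempt records and question IDs are invented sample data.
TARGET = 0.90

# (question_id, answered_correctly_on_first_attempt)
first_attempts = [
    ("T5A01", True), ("T5A01", True), ("T5A01", False),
    ("T7B03", True), ("T7B03", True), ("T7B03", True), ("T7B03", True),
]

def first_attempt_rates(attempts):
    """Aggregate first-attempt results into a per-question correct rate."""
    totals, correct = {}, {}
    for qid, ok in attempts:
        totals[qid] = totals.get(qid, 0) + 1
        correct[qid] = correct.get(qid, 0) + (1 if ok else 0)
    return {qid: correct[qid] / totals[qid] for qid in totals}

needs_rework = {
    qid: rate
    for qid, rate in first_attempt_rates(first_attempts).items()
    if rate < TARGET
}
print(needs_rework)  # questions whose infotext may need revision
```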
Sometimes the problem is that an infotext is trying to cover too much at once. For example, we had an infotext titled “Short circuit / Open circuit” that defined seven different terms, and some of the related questions weren’t hitting our 90% standard. We split that infotext into two—one for short circuits and one for open circuits. The concepts were independent enough that separating them made the material easier to follow, with no downside. Our experience suggests that students do better with shorter, more focused infotexts, but we’ll confirm this as we collect statistics on the new questions.
A teachable moment
When a student chooses a distractor (an incorrect answer choice), it often creates a teachable moment. At that instant, the student’s brain wants to know why the answer was wrong, and we try to turn that curiosity into learning. We are always on the lookout for these moments, especially when our statistics show that too many students are choosing the same distractor.
The next challenge is deciding how best to address it. In some cases, we can add a small amount of information to the infotext so that all students benefit. Other times, if the information is too off-topic, we place it in the explanation that appears after the distractor is chosen, providing timely feedback to those who need it. There is no one-size-fits-all rule. Each case requires judgment about where the clarification will do the most good.
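The distractor analysis described above can be sketched as a simple frequency count. This is a hypothetical example, not the actual reporting code: the wrong-answer data and the 50% flagging threshold are invented for illustration.

```python
# Hypothetical sketch: count how often each wrong answer choice is
# selected, to spot distractors drawing a disproportionate share of
# students. The choice data and 50% threshold are invented sample values.
from collections import Counter

# Wrong answers chosen for one question (the correct answer was "C").
wrong_choices = ["A", "A", "D", "A", "B", "A", "D"]

counts = Counter(wrong_choices)
total = sum(counts.values())
for choice, n in counts.most_common():
    share = n / total
    flag = "  <-- investigate" if share > 0.5 else ""
    print(f"{choice}: {n}/{total} ({share:.0%}){flag}")
```

A distractor that accounts for most of the wrong answers is a strong hint that a specific misconception is at work, which is exactly where a targeted explanation pays off.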
Using AI carefully
For the first time, we also used AI to help improve the clarity of our infotexts. In particular, ChatGPT proved to be an excellent wordsmith, often suggesting clearer ways to present the material. At the same time, it sometimes introduces subtle technical errors, so we had to review every proposed change carefully. ChatGPT essentially became a brilliant but sloppy member of our team, discussing and debating each tiny detail, much like a human collaborator. Of course, the Coursemaster always has the final say!
Here’s an example of how we improved the text without letting technical errors creep in:
- We previously had, “The feed line is the wire that runs from your radio to your antenna.”
- ChatGPT suggested, “The feed line is the cable that carries radio-frequency energy from your transmitter to your antenna.” That wording is more precise, but it overlooks the fact that the feed line is a two-way street: it also carries received signals back to the radio.
- We settled on, “The feed line is the cable that carries radio-frequency energy between your radio and your antenna.”

Ready early
Our goal each cycle is to have the updated course and new questions ready by May 1—two full months before they take effect. We wrapped up the final review ahead of schedule this year. Special thanks to my technical review team, Ron and Dustin, for the time and care they devoted to this project.
The new Technician course is now live, and students with an active Technician subscription automatically have access to both the current and upcoming question pools.
Feel free to respond to this email with comments for the HamTestOnline™ team. We love feedback!