As part of our intro for new Agents, I always make sure to point out that there is help available if they need it while on the Mission. The Help utility doesn't provide the answer; it's more like a hint to get you moving. But help is not free. I tell everyone that they will lose a little of their potential points by using Help (10% for each clue). Help clues are written for each specific task, so they will always be relevant.
But no one uses Help. It's actually become a joke when I introduce the Operations screen and point out how easy the Help function is to use: "It's there if you need it. I built this whole utility, but no one ever uses it." We've been up and running for a while now, so I thought I would run some queries to see just how helpful Help can be.
Over the past 4 months, Agents have attempted 857 Missions. Each Agent team gets 3 tries to complete a Mission, so these attempts may include multiple tries by a single team. Very few Missions (fewer than 10 so far) are completed on the first try. Many failures happen because people don't read the directives completely and end up guessing at a solution. But I also see a lot of Agents who can't work out where their attention should lie. They clearly need a little Help.
I was very particular about the data architecture for processing and tracking Agent performance on a Mission. That lets us drill into the data. Of the 857 Missions, 404 (47%) ended in failure because a task was completed incorrectly. Another 132 (15%) failed because time ran out on a task. Since a single failed task ends the Mission, that means that on 536 Missions (62%), a team was presented with a task they could not complete, either correctly or in time. Note that there are multiple tasks per Mission: Agents attempted 5,501 tasks in total, so the per-task failure rate is only ~10%.
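For anyone who wants to check my arithmetic, here is a quick back-of-the-envelope sketch using the counts above (the variable names are mine, purely for illustration; this is not our actual reporting code):

```python
# Back-of-the-envelope check of the Mission failure numbers above.
missions_attempted = 857
failed_wrong_answer = 404   # a task was completed incorrectly
failed_timeout = 132        # time ran out on a task
tasks_attempted = 5501

failed_missions = failed_wrong_answer + failed_timeout       # 536
mission_failure_rate = failed_missions / missions_attempted  # ~62%
task_failure_rate = failed_missions / tasks_attempted        # ~10%

print(f"Failed Missions: {failed_missions} ({mission_failure_rate:.1%})")
print(f"Per-task failure rate: {task_failure_rate:.1%}")
```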
But here's where it gets interesting. Of those 536 failures, Agents used the Help function only 39 times (7%). When they DID use Help, it led to a correct answer 64% of the time (25 tasks). Looking at the actual submissions, 3 of the other 14 teams that used Help and still failed the task simply typed the answer wrong, so they were close; counting those would bring the success rate to 72%. Of the remaining 11, I can see from their submissions that 2 were on the right path. Interestingly, 9 Help users (23%) received no 'help' at all (gotta work on that).
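To make that breakdown of the 39 Help uses easier to follow, here is the same tally as a sketch (the category labels are mine, not fields from our system):

```python
# Breakdown of the 39 tasks where a team clicked Help, per the review above.
help_outcomes = {
    "answered correctly": 25,
    "right answer, typed wrong": 3,
    "on the right path": 2,
    "clue didn't help": 9,
}
total_help_uses = sum(help_outcomes.values())  # 39

strict_success = help_outcomes["answered correctly"] / total_help_uses
generous_success = (help_outcomes["answered correctly"]
                    + help_outcomes["right answer, typed wrong"]) / total_help_uses

print(f"Strict success rate: {strict_success:.0%}")       # 64%
print(f"Counting near-misses: {generous_success:.0%}")    # 72%
```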
So what can we learn from this? Using Help still has a stigma, but maybe not in the way we think. Asking for Help in TheMissionZone app isn't like raising your hand in class or calling a staff member in an escape room. It's literally a button in the app that opens a popup modal. It could not be any more private. When I ask Agents why they didn't click the button, many will say something like, "I don't want help, I wanted to do it on my own." (Note: many forget it's even there once their adrenaline gets going.) The data seems to indicate that a high percentage of people would rather fail a challenge than suffer the blow to their pride of asking a computer for help. Wow. The larger question: why doesn't this kind of help still count as 'doing it on your own'?
My background in data management gave me the foresight to realize that tracking this information, and being able to quantify these discussions, would be useful. There are so many data points to examine: Which tasks require Help? Which cause the most failures? How long do teams work on a task before clicking Help? How much time is left on the clock when they click? Why are people almost twice as likely to use Help on our family-friendly Level 1 Mission (23 uses) as on the adult-themed one (13)?
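As an example of what those queries might look like, here is a hypothetical sketch against a simplified version of our task-attempt data (the database file, table, and column names here are illustrative, not our production schema):

```python
import sqlite3

# Illustrative layout: one row per task attempt, with a 0/1 flag for Help use.
# (Table and column names are hypothetical, for the sake of the example.)
conn = sqlite3.connect("mission_zone.db")
query = """
    SELECT mission_level,
           COUNT(*)                                    AS attempts,
           SUM(used_help)                              AS help_uses,
           ROUND(100.0 * SUM(used_help) / COUNT(*), 1) AS help_rate_pct
    FROM task_attempts
    GROUP BY mission_level
    ORDER BY help_rate_pct DESC;
"""
for level, attempts, help_uses, rate in conn.execute(query):
    print(f"Level {level}: {help_uses}/{attempts} attempts used Help ({rate}%)")
```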
I am so excited for the day when employers can talk with employees about getting more comfortable asking for help. I am inspired by the thought of a teacher telling a student that it's OK to ask for help, not just on a Mission, but in the classroom…look at how much success it can bring!
There is so much more in the data we are collecting. I encourage researchers to reach out to me so we can discuss even more beneficial ways to derive insights and affect lives.