Words by DEBIKA RAY
Human error is often blamed for security breaches, but natural human “deviation” is essential to making systems work, argues Kartikeya Tripathi, a lecturer in security and crime science at University College London.
Tripathi has undertaken research that focuses on decision-making by frontline staff in transport environments, such as metro systems in major cities in Europe and Asia. While safety is a well-established concern, security is a more recent addition — with the heightened risk of terrorist attacks on transport systems, employees are increasingly relied upon to perform security tasks that were not traditionally part of their role.
Security models typically require people to follow standard operating procedures imposed from the top, but Tripathi has found that they quite often don’t. This shouldn’t necessarily be regarded as a problem, he says: “Sometimes deviation is necessary. A transport network is a complex environment where staff are trying to meet multiple goals. Security is just one of them and often an add-on to their primary role. If front-line staff were to apply procedures in textbook form, a lot might not work and could just create more risk.”
Studies had already shown that transport staff may cut corners on safety in order to meet punctuality targets. By interviewing drivers and analyzing simulations on a metro system in a large Asian city, Tripathi found not only that this applied to security too, but also that safety and security — both perceived as “eternal killjoys” — could themselves be contradictory. For example, if a passenger reported a suspicious item to the driver, the driver was supposed to make an announcement and continue at maximum speed to the next stop. But the participating drivers feared that an announcement would cause panic and a stampede at the next station, while speeding up could make it harder to brake. Many tended to prioritize safety over security, since the chance of a threat turning out to be real was so low.
Not everything can be easily classified as a security or safety task, he adds, so procedures need to consider mutual dependencies and interactions: “A big element is using experience to judge the extent to which you should modify the procedures given to you.”
A similar situation arises all the time among other professions managing complex risks, he points out — surgeons must continually respond to what’s actually happening on the operating table, while pilots may have to change course or land suddenly in the interests of safety. The most effective security procedures leave scope for people to react to events in real time and make their own judgements, rather than badging every deviation as an error: “When you know that people will deviate, you can design more realistic response procedures.”