Designing a product, whether it’s an application, professional software, a robot, or a car, is a difficult task.
Before the 2000s and the advent of UX design, user experience was rarely taken into account. Developers and engineers designed based on what made sense to them. But a coder’s logic is not the same as everyone else’s, and users had to learn to use the software at their disposal the way they would learn to drive a car.
The adage at the time was that “in 90% of situations, the problem is between the screen and the chair”. The human factor has always been blamed as the cause of problems, as if a machine could never fail. On the contrary, the human factor is the key to solving many problems and preventing many incidents.
The reality is that in 90% of cases, the software itself was simply not ergonomic, which is why it is important to hire specialists who can improve it and promote an optimal user experience.
When designers find out that what they designed is not being used properly, or that users don’t understand everything they could do with it, they tend to blame themselves.
Designers are afraid of repeating the mistakes of the 90s: designing things that make sense to them but not to their users.
But despite all the tools at their disposal (user testing, guerrilla testing, heuristic analysis), it is not possible to design a perfect product.
First of all, there are too many factors that come into play: culture, computer skills, age, browsing habits, and so on. All these elements, and many more, guarantee that there will always be someone who would have preferred things to be done differently.
That’s why it’s important to work with niches. By identifying specific users within a niche, it is easier to take their needs into account and find solutions that will suit almost all the intended users.
Secondly, human behavior is too unpredictable. Usage situations and the randomness of behavior will always find the flaw in a system.
This problem has been illustrated in many novels, but particularly well by Isaac Asimov. In his Robot series, Asimov starts from the premise that artificial beings are built with three laws imprinted in them, from which they cannot deviate:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm;
- A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law;
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
No matter the circumstances, robots always respect these three laws (within the universe of the novels). Yet on many occasions, humans working with robots find themselves in danger and must use their wits to find the cause of the malfunction.
In the short story “Runaround”, two engineers stationed on Mercury find themselves in a critical situation when the robot that was supposed to bring back the selenium they needed starts to malfunction. With only a few minutes of autonomy left in their suits, one of the engineers understands where the malfunction comes from: the order to fetch the selenium was given in a casual tone, without urgency, and the selenium pool sits in a zone dangerous to the robot. Caught between the Second Law (a weakly given order) and the Third Law (self-preservation), the robot circles the danger zone without ever entering it, because it does not realize that leaving the humans stranded for days would endanger them.
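The robot’s behavior can be thought of as two competing drives that cancel out at a fixed distance from the goal. As a rough illustration (not anything from the novels, and with all numbers invented), one might sketch the equilibrium like this:

```python
# Illustrative sketch of the "Runaround" dilemma: a weak Second-Law pull
# toward the goal balances against a Third-Law push away from danger.
# All values below are invented for illustration purposes only.

def net_drive(distance, order_strength=1.0, danger_strength=4.0):
    """Net motivation along the line toward the goal.

    Positive pushes the robot toward the danger zone, negative away.
    - Second Law pull: constant, set by how forcefully the order was given.
    - Third Law push: grows sharply as the robot nears the danger zone.
    """
    return order_strength - danger_strength / (distance ** 2)

def equilibrium(distance=10.0, step=0.5, iterations=100):
    """Move along the net drive until the two laws cancel out."""
    for _ in range(iterations):
        drive = net_drive(distance)
        if abs(drive) < 1e-3:
            break  # drives balance: the robot stops advancing and circles
        distance -= step * drive  # advance while the order's pull dominates
    return distance

# A casually given order (weak pull) leaves the robot stuck where
# order_strength == danger_strength / d**2, i.e. at distance 2.
print(round(equilibrium(), 2))  # → 2.0
```

Raising `order_strength` (an urgent, forceful order) moves the balance point inside the danger zone, which is essentially how the engineers in the story resolve the conflict.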
No doubt, if a situation like this occurred, human error would be blamed, since the robot is technically capable of doing what it was created to do. But the problem comes from the robot: without human intervention, it would have kept circling its objective forever, obeying the initial order while avoiding damage to itself.
Blaming users for the way they use a tool means refusing to make an effort as a designer; but blaming designers for every misuse means refusing to accept that humans will always misuse the tools at their disposal, for reasons we cannot guess without observing them.
The human factor is something to be studied, not denigrated. It is responsible for errors, but above all it reveals the limits of a design. The human factor is also responsible for repairing incidents: without humans to intervene, many incidents would turn into accidents.