When a rocket launches, we usually care about only one thing: did it work? We cheer if it reaches orbit and we gasp if it doesn’t. The French philosopher Bruno Latour called this “black boxing”: when a machine is successful, we stop looking at its complex inner parts and see only an input (the launch) and an output (a satellite in orbit). The box is sealed by its own success. But Latour also found that a black box must crack open when things break: when a car stalls you need to open the hood. You can no longer ignore the engine or hope to fix the problem by working on the chassis.
ISRO has yet to release the Failure Analysis Committee (FAC) report for the PSLV-C61 mission. For PSLV-C62, a short statement on its website says “a detailed analysis has been initiated”; I’m not sure if this refers to a new FAC. If the organisation is opening the “black box” only for itself, investigating failures internally while keeping the results secret from the public and independent peers, it’s falling into a trap Latour warned about: objectivity doesn’t come from a single person or a single agency looking really hard at a problem but from having as many different people as possible, with different viewpoints and biases, look at it.
For years, NASA engineers knew that foam insulation was falling off the external fuel tank and striking the Space Shuttle. They looked at the problem constantly and they analysed it. They did open the hood, but they only talked to each other, and in the process they convinced themselves it wasn’t a safety risk. The American sociologist Diane Vaughan called this the ‘normalisation of deviance’: small departures from conservative practice become routine because “nothing bad happened last time”. Had they released those internal reports to external aerodynamicists or independent safety boards before Columbia lifted off, they likely wouldn’t have had the disaster they did.
Today ISRO risks the same ‘normalisation of deviance’: without external eyes to challenge its assumptions, its experts are at risk of convincing themselves that a recurring PS3 stage glitch is manageable — right up until it isn’t.
Latour also often spoke of the ‘parliament of things’, the idea that technologies like rockets are part of our political and social world rather than simply technical objects. If ISRO solves the problem internally, it might fix the specific valve or sensor but it won’t fix the institutional pressure that allowed quality control to slip in the first place. Only public scrutiny, i.e. the assembly of MPs and citizens asking irritating questions like “why?”, can force an agency to fix its culture as well as its hardware.
Then there’s institutional memory: when you fix a problem in secret, you also withhold the lessons you’ve learnt from young engineers. Public reports are effectively a permanent, searchable archive of mistakes.
In the 1979 book Laboratory Life: The Social Construction of Scientific Facts, which he coauthored with the British sociologist Steve Woolgar, Latour described an “inscription device” as any piece of laboratory apparatus, no matter how large or expensive, whose final output is a visual display. For instance, a bioassay might start with pipetting chemicals and shaking tubes but it ends with a sheet of paper bearing numbers or a jagged line on a graph. That paper is the inscription. At this point the scientists discard the physical substances (the chemical compounds) and retain the inscription.
According to Latour, science is almost never a single ‘eureka!’ and almost always a series of inscriptions. This narrative is useful for understanding why objectivity in science is often a myth: scientists don’t just passively observe nature; they are writers and craftworkers in their own right, and they draw on those skills to make sense of nature. A statement becomes a ‘fact’ only when the inscriptions supporting it are so clear and so numerous that dissenting voices are silenced, and to challenge a fact you need to produce counter-inscriptions of similar or greater calibre.
But when there’s no inscription, when the FAC reports are invisible, what do you challenge if you need to? How do you achieve progress in a rational way?
The Soviet Union’s N1 rocket was its equivalent of the USA’s Saturn V, designed to take cosmonauts to the moon. And it failed all four times it launched. An important reason was that, for all its other successes, the Soviet space programme was a sealed box. There was no independent press to ask why the rocket’s engines were exploding and no parliamentary questions about safety protocols — and inside this Matryoshka doll of secrecy its engineers were paralysed by political pressure. When data showed the rocket had a high probability of failure, managers simply massaged the numbers to please the Kremlin. And because the failures were state secrets, the collective intelligence of the scientific community was never brought to bear on the problem.
Look, on the other hand, at NASA’s Challenger disaster in 1986, which was also a tragedy born of political pressure to launch at all costs. NASA managers had ignored warnings from engineers about the Space Shuttle’s O-rings failing in cold weather; they had, as with Columbia but 17 years earlier, normalised deviance and accepted small failures right up until they added up to a big one. After the explosion the American system forced the black box open: the Rogers Commission identified the technical fault and interrogated the institutional culture behind it. By publicly airing these concerns — including ‘letting’ Richard Feynman dip an O-ring in ice water on live TV to prove a point — NASA was humiliated, yes, but it was also saved. The scrutiny forced it to rebuild its safety protocols, recover public trust, and return an object as complex as the Space Shuttle to flight, until Columbia revealed this turnaround to have been incomplete.
Because the Soviet state kept the failures of its N1 moon missions a secret, future Russian engineers couldn’t fully study those specific failures in the open academic literature. NASA’s failures, on the other hand, are effectively public textbooks, with engineers in India, Europe, and China still studying its failure reports today to avoid making the same mistakes. Likewise, by hiding the PSLV-C61 report, the PSLV-C39 FAC report, and other reports of a similar nature, ISRO isn’t just hurting itself: it’s hurting the global knowledge base of rocketry. And like the Soviet Union of yore, and unlike NASA in the late 1980s and the early 2000s, by shielding its findings from criticism ISRO is ensuring its solutions are weak and at risk of failing again.
If ISRO engineers know a failure will be hushed up to protect the prime minister’s image, they may be less likely to speak up about a faulty sensor or a cracked nozzle. If people can’t ask why the PS3 stage failed, the pressure to fix it is essentially replaced by the pressure to just “make it look good” for the next launch. In the end, by closing itself off, ISRO risks becoming a fragile institution. It treats its rockets as matters of fact — unquestionable symbols of national pride — rather than as matters of concern, complex machines that need honest and sometimes harsh public maintenance. There’s a reason transparency is one of the ingredients of good engineering.










