Japan’s first civil air raid drills were held in Osaka in 1928, but it wasn’t until March 1937 that the Diet passed an Air Defense Act establishing rules for how the nation would respond to bombing raids on the homeland. War in China was imminent, but probably few could have imagined the horror of the air raids to come the following decade.
With Japan-U.S. relations growing tense, in February 1941 the law was amended in two important respects. First, the government was empowered to prohibit civilians from seeking safety — to order them to remain in areas likely to be bombed. Second, owners and occupants of houses and buildings became legally obligated to help fight fires caused by air raids. These mandates were backed by stiff criminal penalties.
The real enforcement, however, came through the Home Ministry’s oppressive tonarigumi system of local organization and mutual supervision (remnants of which survive today in the form of chōnaikai community associations). Since wartime rationing was conducted through these organizations, compliance could be achieved through the tacit threat of further hunger and deprivation.
Protective bunkers for the people were an afterthought. Civilians were instructed to dig shallow holes in dirt under the tatami mats in their homes. These were called “staging areas” rather than “shelters,” since the idea was that people would hide below their highly flammable dwellings while the bombs fell and then promptly emerge to help extinguish any resulting fires.
To help drive home the message, catchy slogans were developed, like: “First: Control lamps and flames. Second: Protect against fire. Third: Protect against poison gas. Fourth: Smile!”
Millions of civilians were killed, maimed, orphaned or traumatized by the Allied bombing campaign. It is impossible to ascertain how many of these could have been avoided by having laws that cared more about people than buildings, but the number is probably not small.
While the postwar government was generous in compensating ex-army and naval personnel, military auxiliaries and their families for wartime injuries and deaths, civilian victims received nothing. The rationale was that soldiers and sailors were doing their duty and deserved to be looked after.
A 2008 lawsuit brought by a group of civilian victims challenged this orthodoxy on equal-protection grounds, basing their argument in part on the fact that the obligations in the Air Defense Act turned civilians from passive victims into duty-bound participants in the war effort, just like uniformed personnel. They lost, of course, but the judgments at both the district court and high court levels acknowledged the government’s air defense policies had enhanced the dangers to which civilians were exposed, findings of fact the Supreme Court let stand when it rejected the plaintiffs’ final appeal in 2014.
The Air Defense Act doubtless seemed as insane to the people who had to obey it as it does to us today. Yet that impression probably comes from assuming the law was intended for the benefit of the people. But what if it was not? Historically, Japan’s attitude toward law has been one that sees society as being improved by wise authority figures instructing the people how to act, rather than by regulating their freedom.
From the standpoint of a technocrat thinking about how to minimize damage from air raids — about what people (other people) should do — it probably seemed completely rational to conscript the citizenry into the nation’s fire-fighting infrastructure. That this required civilians to act counterintuitively — by exposing themselves and their families to the risk of harm — simply meant that the government needed to lie to them about the dangers involved. Official instructions sanguinely advised that American incendiary bombs could be doused with a bucket of water or gingerly picked up with a rag and thrown outside. Unlike today, fear of death from above needed to be diminished, not fostered.
The government needs you
Now defunct, the Air Defense Act is worth remembering — not only because Japanese are learning to duck and take cover from North Korean ballistic missiles, but because it provides a stark example of how laws can regard people as the means of implementing policy rather than its intended beneficiaries. Also, although the act was backed by harsh penalties, day-to-day implementation relied heavily on notionally voluntary behavior that was actually coerced. (Similarly, kamikaze pilots were volunteers, but the government still imposed quotas on schools, which then had to pressure their students to ensure an adequate supply.)
These are familiar themes in regulation today: Policies that seem rational to the clever bureaucrats who create them are implemented through people who are required not only to act irrationally but often to do so “voluntarily.” People are expected to volunteer for the Olympics because that is the right thing to do. NHK is funded through the fiction that viewers have voluntarily entered into a contract with the public broadcaster, and criminal suspects “voluntarily” accompany police to the station and, without a lawyer present, “voluntarily” confess to crimes.
Such policies often have unintended results because they tend to ignore the realities of the people who are supposed to follow them. Problem: Japanese people do not take enough time off. Solution: “Happy Monday,” a 2001 law that rescheduled a number of public holidays so they always fell on Mondays, giving people more long weekends to play with. Result: Public holidays have become increasingly meaningless for people working at universities (and probably elsewhere), because it is impossible to schedule a semester’s worth of Monday classes without ignoring them.
Or take what will become increasingly apparent between now and next year: people losing their jobs despite wanting to continue working and their employers wishing to keep them. In 2018, amendments to the Labor Contract Act will take effect that give employees on fixed-term contracts that have lasted more than five years the option of demanding conversion to permanent status. Amendments to the Worker Dispatch Act proscribing the use of a particular dispatched worker in the same job for more than three years will also take effect.
Fostering long-term employment stability is a perfectly well-intentioned policy goal. Whether the goal can be achieved by laws such as these is questionable, since they fail to address the risks that come with permanent employees in the form of long-term fixed costs. Not only are there no clearly articulated rules for terminating or buying out permanent employees, but the cost of having them can also increase through other well-meaning, rational government policies such as those seeking to secure employment for workers beyond the standard retirement age of 60.
Thus, even if both worker and employer are happy with the status quo, it may be perfectly rational for employers to terminate a worker rather than take them on permanently. This is already happening in academia, where adjunct professors who have been teaching the same course for years are being pushed out or converted to “contractor” status because of the looming five-year threshold. Some adjuncts might like a permanent job and perhaps a few will get one, but the end result will likely be more people simply losing one or more part-time gigs through no fault of their own.
Outside academia, companies are reportedly increasing their permanent hires, but this is more likely driven by labor shortages in general and won’t change the fact that employment relationships will end despite both sides wishing otherwise.
Then there’s banking. After years of subjecting smaller regional banks to rigid oversight devoted to getting bad loans off their books, the Financial Services Agency is now grappling with the problem of the same banks being reluctant to lend. Having effectively raised an entire generation of bank managers who know only how to demand more collateral and reject loan applications, the FSA is in the odd position of trying to force banks to take more risk — unsecured loans to new ventures! — and, in the process, telling them how to do things like evaluate business plans. Coming at a time when negative interest rates are already squeezing profit margins, increased risk-taking is not a strategy all banks would rationally adopt if they weren’t badgered into doing so by regulators. Since policy shifts of this sort have to be explained by regulators in a way that continues to justify their prior policies (which might have been part of the problem), those being regulated probably quickly stop thinking about what is rational anyway: “Just do what the bureaucrats say.”
Then there is the surreal world I inhabit, Japanese law schools. Established in 2004, law schools were supposed to train a new breed of lawyers, judges and prosecutors by teaching law students what they needed to know in order to pass the bar exam — without teaching them how to pass the bar exam.
Really! By regulation, Japanese law schools are prohibited from devoting too much curricular time to bar exam subjects, and accreditation guidelines require them to steer clear of anything that smacks of the grubby exam-taking techniques that were supposedly the problem with the old system of qualifying legal professionals. At the same time, regulators formally evaluate law schools based on their bar exam performance: Those whose graduates fare poorly are bad.
Moreover, the thousands of graduates who spent years of their lives going to law school but never passed the bar exam (in part because the government kept arbitrarily reducing the number of people it allowed to pass) are branded failures, though really they are just the well-educated collateral damage of policies that have affected them profoundly but were never really for their benefit. Duck and cover indeed.
Colin P.A. Jones is a professor at Doshisha Law School in Kyoto. The views expressed are those of the author alone.