I first encountered this rule in web development, but once I'd spotted it there I found it holds true in many diverse areas of life.
When designing a system of rules or procedures (a computer program, laws, a business's internal policies and procedures, etc.) it's always tempting to ignore edge-cases - they seem so obscure or unlikely that it's easy to decide they don't matter, and not to bother resolving ambiguities or closing loopholes.
People tend to think about systems of rules the way they think about other people - you don't have to be too precise, because the intent behind your words will be clear.
However, once set up, systems are administered according to the rules which define them - while "the original designer's intent" is nebulous and open to interpretation, the letter of the law is usually quite specific, even if following it produces results quite different from what the original architects intended. Nobody ever got fired for following the letter of the law, even when doing so did great violence to its spirit.
This can be seen in all walks of life - if you're a naive programmer developing a web application it can be tempting to ignore security holes or undefined edge-cases. "Who will ever spot that?" you think to yourself, "nobody will bother poking around in odd corners of my application, or try firing odd URL parameters at my server. I'm much better off adding Whizzy New Feature #436 to my application than tidying up some dusty old corner of the code".
This sounds perfectly reasonable to most people, but any experienced web developer will be shaking their head about now - first off, when you write code for websites it's exposed to the entire internet, and there's always someone out there who'll start poking it with a stick, just to see what it does.
Even worse, there are also whole swathes of entirely automated systems - web spiders, spam-bots and vulnerability scanners - that will systematically follow every link and try every combination of URL parameters they can generate, simply to see what happens.
The key point to take away here is that - almost invariably - your audience will turn out to be far larger and more diverse than you imagine; what seems obscure, boring or unimportant to you won't necessarily seem that way to all of them, and neglecting to handle those edge-cases can leave the entire system compromised.
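To make that concrete, here's a minimal sketch of the habit experience teaches - treating every URL parameter as a potential edge-case rather than trusting it. It uses Python and Flask purely as an illustration; the endpoint and parameter names are invented for the example.

```python
# Hypothetical /articles endpoint with a "page" query parameter, shown
# only to illustrate the principle: anything arriving from the internet
# will eventually be probed with values you didn't anticipate.
from flask import Flask, abort, request

app = Flask(__name__)

ARTICLES_PER_PAGE = 20
MAX_PAGE = 500  # arbitrary cap for the example


@app.route("/articles")
def list_articles():
    raw_page = request.args.get("page", "1")

    # Naive code would call int(raw_page) and trust the result; a scanner
    # will happily send ?page=-1, ?page=999999999999 or ?page=abc.
    try:
        page = int(raw_page)
    except ValueError:
        abort(400, "page must be an integer")

    if not 1 <= page <= MAX_PAGE:
        abort(400, "page out of range")

    offset = (page - 1) * ARTICLES_PER_PAGE
    # ...look up and render the validated slice of articles...
    return f"articles {offset + 1} to {offset + ARTICLES_PER_PAGE}"
```

The same habit applies to every header, cookie and form field: validate it explicitly, or assume something out there will eventually feed it a value you never imagined.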
Likewise, laws suffer from this problem - they're typically crafted in vague language and - like any non-trivial system - contain numerous unspotted edge-cases and loopholes. Moreover, the equivalent of issuing a patch to an existing law is about as complicated, fraught and long-winded as passing the law in the first place, making it difficult, time-consuming and expensive to correct errors after the fact.
As with programs, relying on obscurity to paper over these loopholes is a mug's game - when laws apply to an entire country's population you're pretty much guaranteed that eventually someone will either deliberately target or simply stumble upon an unhandled edge-case. When that happens the system can be gamed, and the laws fail to perform their intended function.
The results may be anything from a single individual getting away with a parking ticket to your entire society taking a turn for the worse.
Remember: if a system can be gamed, it will be - so take care to eliminate every edge-case you can, and practise defence in depth so that when a loophole or compromise is inevitably discovered, the amount of the system it can affect is limited.
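As a rough sketch of what defence in depth can look like in code (the function names here are invented for the example), each layer re-checks the rule independently, so getting past one layer doesn't compromise the whole system:

```python
# A minimal sketch of defence in depth, with invented function names
# (handle_withdrawal_request, withdraw, get_balance) purely for
# illustration. No single layer is trusted to be the only check.

def handle_withdrawal_request(account_id: str, raw_amount: str) -> str:
    # Layer 1: validate untrusted input at the edge.
    try:
        amount = int(raw_amount)
    except ValueError:
        return "error: amount must be a whole number"
    if amount <= 0:
        return "error: amount must be positive"
    return withdraw(account_id, amount)


def withdraw(account_id: str, amount: int) -> str:
    # Layer 2: the business logic re-checks its own invariants rather
    # than trusting whoever called it.
    if amount <= 0 or amount > get_balance(account_id):
        return "error: withdrawal refused"
    # Layer 3 (not shown): the data store itself enforces a constraint
    # such as balance >= 0, so even a bug here can't break the invariant.
    return f"withdrew {amount} from {account_id}"


def get_balance(account_id: str) -> int:
    # Stand-in for a real data store lookup.
    return 100
```

The point isn't the particular checks - it's that no single layer is trusted to be the only line of defence, so gaming one of them only gets an attacker so far.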
This advice applies equally whether you're a developer writing computer code, a politician crafting new laws or a manager adjusting business processes in a company. If it's a system of rules, this design axiom applies.