next up previous
Next: ``Environmental intelligence'' or intelligence-gathering Up: Problem statement Previous: Computer Science or Computer Secrecy

Obvious or obfuscated?

Imagine a clock designed so that, when the cover was lifted off, all the gears would fly out in different directions, making it more difficult for a young child to open up his or her parents' clock and determine how it works. Alternatively, imagine a clock loaded with explosives, so that it would completely self-destruct upon opening.

Assuming the child survived such an experience, it is still doubtful that devices made in this manner would be good for society, and in particular for the growth and development of young engineers and scientists with a natural curiosity about the world around them.

As the boundary between software and hardware blurs, devices are becoming more and more difficult to understand. This difficulty arises in part from deliberate obfuscation by product manufacturers. More and more devices contain general-purpose microprocessors, so that their function depends on software: specificity of function is achieved through specificity of software rather than specificity of physical form. By shipping everyday devices with only executable code, without source code, manufacturers provide a first level of obfuscation. Additional obfuscation tools are often used to make the executable task image still harder to understand. These include strippers, which remove object link names and other symbols, and even tools for building encrypted executables, in which a dynamic decryption function maintains a narrow sliding window of unencrypted code, so that only a small fragment of the executable is decrypted at any given time. In this way, not only is the end user deprived of source code, but the executable itself is encrypted, making it difficult or impossible to examine the program even at the machine-code level.

Moreover, Complex Programmable Logic Devices (CPLDs), such as the Altera 7000 series, often have provisions to permanently destroy the data and address lines leading into the device, so that a single-chip device can operate as a finite-state machine yet conceal even its machine-level contents from examination. (An excellent tutorial on FPGAs and CPLDs may be found in [3].) Devices such as the Clipper chip go a step further by incorporating fluorine atoms, so that if a user attempts to put the device into a milling machine, to mill it off layer by layer for examination under an electron microscope, the device will self-destruct in a drastic manner that destroys its structure. Thus a Clipper phone could contain a ``trojan horse'', or some other kind of ``back door'', and we might never be able to determine whether or not this is the case. This is yet another example of deliberate obfuscation of the operational principles of everyday things.

Thus we have a growing number of general-purpose devices whose function or purpose depends on software, whether downloaded code or microcode. Because this code is intellectually encrypted, so are the purpose and function of the device. In this way, manufacturers may state one function or purpose, while the actual function or purpose differs, or includes extra features of which we are not aware.


Steve Mann
1998-09-15