These lectures may have been given a long time ago, but this really spoke to me across the decades:
But do not lose sight of the real world where your ultimate product must be used, and beware of security features so intricate, elaborate, complex, difficult, and expensive that our customers throw up their hands and keep on communicating in the clear...
As did this:
In retrospect, the demise of the obsolescent, inefficient, and insecure systems seems natural, easy, inevitable, and relatively painless. But the fact of the matter is that it is usually quite difficult to get the users to relinquish any equipment once it is solidly entrenched in their inventories — especially if it works well [...]; but even if it doesn't [...] The reluctance to junk old systems stems from a number of causes, I think. First of all, they represent a large investment; secondly, the users have developed a supporting logistic base for the systems, have trained personnel to operate and maintain it — they've used it. Finally, the introduction of a new system is a slow and difficult business requiring new budgetary and procurement action, new training, the establishment of a new logistics base, and – increasingly these days – a costly installation job [...].
And this remains excellent advice in every field:
I suggest that newcomers to the business not jump on board whichever side of [a] controversy your viscera may first direct. Rather, take the other side – whichever it is – and go through the exercise of building its defense. You are likely to be surprised at how elaborate and involuted the arguments become either way and might lead you to my personal conclusion that the best way to achieve a net gain in our [endeavor] is through compromise.
That said, I've often found it interesting to play at being a zealot to see where it leads, as long as you're careful to eventually move on: keeping what was useful, dropping what wasn't, and coming away with a better understanding of the limitations and trade-offs than you would have if you'd never pushed to find out where the limits are.
In fact, I'd be rather perturbed to find someone in their 20s who wasn't over-zealous on at least some issues relating to their profession (my assumption otherwise being that they aren't really interested in what they do). And I'm always deeply suspicious of someone older who doesn't, as a matter of course, talk in terms of trade-offs (my assumption otherwise being that they haven't really learned anything but dogma).
The description in the lectures of the US military's repeated re-discoveries of the problem of compromising emissions is interesting in contrast with the global view in Ross Anderson's book "Security Engineering", the first edition of which said:
The first appearance of compromising emanations in warfare seems to date to 1914. Field telephone wires were laid to connect the troops with their headquarters, and these often ran for miles, parallel to enemy trenches that were only a few hundred yards away. These wires used a single-core insulated cable and earth return in order to halve the weight and bulk of the cable. It was soon discovered that earth leakage caused a lot of crosstalk, including messages from the enemy side. Listening posts were quickly established and protective measures were introduced, including the use of twisted-pair cable. By 1915, valve amplifiers had extended the earth-leakage listening range to 100 yards for telephony and 300 yards for Morse code. It was found that the tangle of abandoned telegraph wire in no-man’s land provided such a good communications channel, and leaked so much traffic to the Germans, that clearing it away became a task for which lives were spent. By 1916, earth-return circuits had been abolished within 3,000 yards of the front. When the United States joined the war, the techniques were passed on.
Passed on, maybe, but seemingly not really taken on board until rediscovered by Bell Labs during WWII in relation to the specific technology of their time, and then forgotten again until the cold war, and then... you get the idea.
There's a strikingly pragmatic tone to the lectures, and a clear understanding of the effects of economics and politics on their work, in addition to engineering and scientific limitations. It's refreshing to see an understanding that, at any given time, there will be some problems you cannot solve, for one reason or another, and that you shouldn't then waste undue effort in those directions because there will be other problems that you can solve. A kind of Alcoholics Anonymous creed for security theater addicts.
I guess they don't give these lectures any more.
Speaking of Ross Anderson, if you didn't already know, it's worth mentioning that the second edition of Security Engineering came out mid-2008. I haven't read the second edition yet, but the first edition (which you can read free online by following that last link) was a must-read. "Security Engineering" is the best book on security ever, by far. This isn't just a CS classic, it's a non-fiction classic that everyone should read. The subject matter is sufficiently important, and the coverage sufficiently fundamental and wide-ranging, that it should be required reading in schools. If you're too tight/broke to buy the second edition, at least promise me you'll read the first edition?
Given how sparing I usually am with praise, you should have a fair idea of just how good Ross Anderson's "Security Engineering" is.
Anyway, if you like Anderson, you'll like "A History of US Communications Security" because it gives some interesting 1970s US military specifics that go well with Anderson's much broader picture and perspective. It would probably be easier to read Anderson's "Security Engineering" first, and reading that book might awaken an interest in security engineering you didn't even know you had. It certainly did for me.