Selected Readings

Futurist Programming, Paul Haeberli and Bruce Karsh (with Ron Fischer, Peter Broadwell, and Tim Wicinski) (June 15, 1991; February 3, 1994) (local copy):

We react against the heavy religious atmosphere that surrounds every aspect of computer programming. We believe it is time to be free from the constraints of the past, and celebrate a renaissance in the art of computer programming.

We find that many of today's computer systems are hopelessly wasteful and inefficient. Computer hardware has realized performance increases of a factor of more than 200 in the last 20 years, while in program design very little progress has been made at all since the invention of the subroutine. We would like to see the science of programming advance as quickly as other fields of technology.

We believe that undergraduate education spends too much time conveying dogma, instead of teaching a sound theory of program design that helps programmers create good programs. Universities should provide students with less religion, and much more practical experience in making and analyzing small, fast, useful and efficient programs.

We believe the result of the common academic approach is computer science graduates who make programs that are fat, slow, and incorrect. The present state of the art in programming discourages experimentation and formal analysis. This seems contrary to what we would expect from a "science". [...]

Computer "Science" terms exposed

The NATO Software Engineering Conferences: 1968 (local copy) & 1969 (local copy)

"Lisp: Good News, Bad News, How to Win Big" (local copy) (talk originally presented 1990, paper originally published 1991) by Richard P. Gabriel is the source of "Worse is Better", but see also the ongoing argument (with others, natch, but also himself):

"The Early History of Smalltalk", Alan Kay, (March 3, 1993) (local DjVu copy) (another local DjVu copy, expanded):

I think the enormous commercialization of personal computing has smothered much of the kind of work that used to go on in universities and research labs, by sucking the talented kids towards practical applications. With companies so risk-adverse towards doing their own HW, and the HW companies betraying no real understanding of SW, the result has been a great step backwards in most repects [sic].

A twentieth century problem is that technology has become too "easy". When it was hard to do anything, whether good or bad, enough time was taken so that the result was usually good. Now we can make things almost trivially, especially in software, but most of the designs are trivial as well. This is inverse vandalism: the making of things because you can. Couple this to even less sophisticated buyers and you have generated an exploitation marketplace similar to that set up for teenagers. A counter to this is to generate enormous dissatisfaction with one's designs, using the entire history of human art as a standard and goad. Then the trick is to decouple the dissatisfaction from self-worth --- otherwise it is either too depressing or one stops too soon with trivial results.

"A Plea for Lean Software", Niklaus Wirth (1995) (local copy):

Increased hardware power has undoubtedly been the primary incentive for vendors to tackle more complex problems, and more complex problems inevitably require more complex solutions. But it is not the inherent complexity that should concern us; it is the self-inflicted complexity. There are many problems that were solved long ago, but for the same problems we are now offered solutions wrapped in much bulkier software. [...]

Initial designs for sophisticated software applications are invariably complicated, even when developed by competent engineers. Truly good solutions emerge after iterative improvements or after redesigns that exploit new insights, and the most rewarding iterations are those that result in program simplifications. Evolutions of this kind, however, are extremely rare in current software practice --- they require time-consuming thought processes that are rarely rewarded. Instead, software inadequacies are typically corrected by quickly conceived additions that invariably result in the well-known bulk.

Time pressure is probably the foremost reason behind the emergence of bulky software. The time pressure that designers endure discourages careful planning. It also discourages improving acceptable solutions; instead, it encourages quickly conceived software additions and corrections. Time pressure gradually corrupts an engineer's standard of quality and perfection. It has a detrimental effect on people as well as products.

The fact that the vendor whose product is first on the market is generally more successful than the competitor who arrives second, although with a better design, is another detrimental contribution to the computer industry. The tendency to adopt the "first" as the de facto standard is a deplorable phenomenon, based on the same time pressure.

"The Coming Age of Calm Technology", Mark Weiser and John Seely Brown (October 5, 1996):

The important waves of technological change are those that fundamentally alter the place of technology in our lives. What matters is not technology itself, but its relationship to us.

In the past fifty years of computation there have been two great trends in this relationship: the mainframe relationship, and the PC relationship. Today the Internet is carrying us through an era of widespread distributed computing towards the relationship of ubiquitous computing, characterized by deeply imbedding computation in the world. Ubiquitous computing will require a new approach to fitting technology to our lives, an approach we call "calm technology". [...]

The third wave of computing is that of ubiquitous computing, whose cross-over point with personal computing will be around 2005-2020. The "UC" era will have lots of computers sharing each of us. Some of these computers will be the hundreds we may access in the course of a few minutes of Internet browsing. Others will be imbedded in walls, chairs, clothing, light switches, cars --- in everything. UC is fundamentally characterized by the connection of things in the world with computation. This will take place at many scales, including the microscopic. [...]

The most potentially interesting, challenging, and profound change implied by the ubiquitous computing era is a focus on calm. If computers are everywhere they better stay out of the way, and that means designing them so that the people being shared by the computers remain serene and in control. Calmness is a new challenge that UC brings to computing. When computers are used behind closed doors by experts, calmness is relevant to only a few. Computers for personal use have focused on the excitement of interaction. But when computers are all around, so that we want to compute while doing something else and have more time to be more fully human, we must radically rethink the goals, context and technology of the computer and all the other technology crowding into our lives. Calmness is a fundamental challenge for all technological design of the next fifty years. [...]

"Back to Personal Computing", Carl Sassenrath (January 20, 1997):

We live in the age of tremendous personal computing power. Our desktop systems run hundreds of times faster than the large, expensive mainframe computers of years past. Yet, what has been the end result of this unbelievable power? Are you now satisfied with the operation of your system? Does it operate and respond as you expect? [...]

Many developers defend their software by arguing: "What is the harm with a 10MB program? Don't you know that memory is cheap?" What they are really saying is: "So what if it takes some time to download. Who cares that it consumes disk space and half the RAM. Perhaps configuring it is a little too complicated. All right, it does have many useless features. But, after all, it has less than a dozen obvious bugs, and it will run at least an hour before crashing."

These developers fail to recognize the core problem: software complexity. In recent years it has become universally acceptable for software technology to be absurdly complex. Systems have grown both out of control and out of proportion to their benefits, becoming wasteful, brittle, clumsy and slow.

"Systems Software Research is Irrelevant", Rob Pike (February 21, 2000) (local copy):

Too much phenomenology: invention has been replaced by observation. Today we see papers comparing interrupt latency on Linux vs. Windows. They may be interesting, they may even be relevant, but they aren't research.

In a misguided attempt to seem scientific, there's too much measurement: performance minutiae and bad charts.

By contrast, a new language or OS can make the machine feel different, give excitement, novelty. But today that's done by a cool web site or a higher CPU clock rate or some cute little device that should be a computer but isn't.

The art is gone.

But art is not science, and that's part of the point. Systems research cannot be just science; there must be engineering, design, and art. [...]

To be a viable computer system, one must honor a huge list of large, and often changing, standards: TCP/IP, HTTP, HTML, XML, CORBA, Unicode, POSIX, NFS, SMB, MIME, POP, IMAP, X, ...

A huge amount of work, but if you don't honor the standards you're marginalized. [...]

Today's graduating PhDs use Unix, X, Emacs, and TeX. That's their world. It's often the only computing world they've ever used for technical work.

Twenty years ago, a student would have been exposed to a wide variety of operating systems, all with good and bad points.

New employees in our lab now bring their world with them, or expect it to be there when they arrive. That's reasonable, but there was a time when joining a new lab was a chance to explore new ways of working.

Narrowness of experience leads to narrowness of imagination. [...]

Be courageous. Try different things; experiment. Try to give a cool demo.

All of Programming in the Twenty-First Century, by James Hague, as well as a couple of earlier standalone pieces.

"Why do we need modules at all?", Joe Armstrong (May 24, 2011):

This is a brain-dump-stream-of-consciousness-thing. I've been thinking about this for a while.

I'm proposing a slightly different way of programming here. The basic idea is [...]
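
The full post goes on to propose doing away with modules entirely: every function gets a unique name and rich metadata, and all functions live in one global, searchable key-value database. A minimal Python sketch of one reading of that proposal (the registry, decorator, and search helper below are illustrative inventions, not anything from the post):

    # Toy version of the "no modules" idea: every function is registered
    # individually under a globally unique name, together with searchable
    # metadata, in one flat key-value database.
    FUNCTIONS = {}  # unique name -> (function, metadata)

    def register(name, **metadata):
        def wrap(fn):
            if name in FUNCTIONS:
                raise ValueError("duplicate function name: " + name)
            FUNCTIONS[name] = (fn, metadata)
            return fn
        return wrap

    @register("lists.reverse", author="joe", tags=("lists",), pure=True)
    def reverse(xs):
        return list(reversed(xs))

    def search(**wanted):
        """Names of all registered functions whose metadata matches."""
        return [name for name, (fn, meta) in FUNCTIONS.items()
                if all(meta.get(k) == v for k, v in wanted.items())]

    print(search(pure=True))  # ['lists.reverse']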

"I Throw Itching Powder at Tulips", Richard P. Gabriel (October 20-24, 2014) (local copy):

None of this is new thinking. And it's not really at the heart of the matter that's engaging me. Right now I am using programming very differently from what the software engineering approaches assume and celebrate, and also from what teachers of programming prepare students for. I program to explore scientific questions. What I produce are instruments that help me peer into the unknown. I don't work on puzzles but on mysteries. I don't have customers; neither requirements; nor specifications; nor test cases (really); nor design issues of the same sorts as software engineers; there are no roadmaps; everything is a prototype and also an end-product; I don't know whether the next thing I try will work, can work, should work; and I am profoundly disappointed when my programs fail to surprise me. If traditional software engineering and agile define two points on a spectrum, what I do is as far from agile as agile is from traditional software engineering, with agile in the middle between my spot and traditional SE.

This is how I see it. [...]

Commercial software is generally not exciting software. It rarely breaks new ground; if anything is difficult, it's difficult because its algorithms might be hard or performance is elusive or because the right kinds of data structures are hard to pin down. Of course there are exceptions. In many cases these difficulties are hidden by frameworks, middleware, libraries, and the like.

And though what the software does might be boring, its design and construction are likely not, and there is tremendous pleasure in designing and building something of value and beauty. But rarely is the construction of commercial software a grand challenge --- sometimes it is, but not frequently. This reality makes the task of creating commercial software mostly a matter of getting the details the way the customer likes. Admirable, but not my game. Consider:

But merely extending knowledge a step further is not developing science. Breeding homing pigeons that could cover a given space with ever increasing rapidity did not give us the laws of telegraphy, nor did breeding faster horses bring us the steam locomotive.
-- Edward J. v. K. Menge

I do science.

"Adult Engineer Over-Optimization as the Motie Problem", Mark Damon Hughes (November 22, 2019):

If you think about a standard software career, there's maybe 10 years of a submissive fool badly coding crap languages [...]

Then maybe 10 years of them being project managers and "architects", running waterfall and Gantt charts; they'll say they're "agile" but then have a giant JIRA repo of "backlog" features which have to be implemented before shipping, weekly 4-hour planning "backlog grooming" meetings, and unrealistic estimates. This is sufficient to build all kinds of horrible vertical prisons of the mind [...]

Then they either retire, or are "downsized", and now what? So they work on their own code, do maintenance on old systems, or leave the industry entirely.

If they work on their own, freed of evil megacorp constraints, they're going to end up in something idiosyncratic and expressive, like Scheme, LISP, Forth, or a custom language. Make their own weirdo environment that's perfectly fit to themself, and unusable/unreadable by anyone else. [...]

All of which brought to mind The Mote in God's Eye, where the Motie Engineers over-optimize everything into a tangled mess, and the Watchmaker vermin are even worse, wiring up everything to everything to make new devices. [...]

"An oral history of Bank Python", Cal Paterson (November 2021):

One of the slightly odd things about Minerva is that a lot of it is "data-first", rather than "code-first". This is odd because the majority of software engineering is the reverse. For example, in object oriented design the aim is to organise the program around "classes", which are coherent groupings of behaviour (ie: code), the data is often simply along for the ride. Writing programs with MnTable is different: you group the data into tables and then the code lives separately. These two lenses for organising computations are at the heart of the object relational impedance mismatch which has caused such grief. The force is out of balance: many more programmers can design decent object-oriented classes than can bring a set of tables into third normal form. This is a large part of the reason that that annoying impedance mismatch keeps coming up.
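
To make the contrast concrete, here is a minimal sketch in plain Python (standing in for MnTable, whose actual API the article does not fully show) of the same trades modelled code-first, as a class that bundles data with behaviour, and data-first, as rows in a table with the code kept separate:

    # Code-first: data and behaviour bundled into a class.
    class Trade:
        def __init__(self, ticker, quantity, price):
            self.ticker, self.quantity, self.price = ticker, quantity, price

        def notional(self):
            return self.quantity * self.price

    # Data-first: the data is a table (here just a list of dicts),
    # and the code lives separately, operating on whole tables.
    trades = [
        {"ticker": "VOD", "quantity": 100, "price": 72.5},
        {"ticker": "BP",  "quantity": 250, "price": 471.0},
    ]

    def filter_by_ticker(table, ticker):
        return [row for row in table if row["ticker"] == ticker]

    def notionals(table):
        return [row["quantity"] * row["price"] for row in table]

    print(notionals(filter_by_ticker(trades, "VOD")))  # [7250.0]

The table version maps directly onto a relational store; the class version is the one that has to be forced through an object-relational mapping, which is the impedance mismatch the article mentions.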

The other unusual thing about Minerva is that it opts, in many cases, to have one big something rather than many small somethings. One big codebase. One big database. One big job runner. Clubbing it all together removes a lot of accidental complexity: you already have a language runtime (and the version in prod is the same as on your computer), a basic database and a place for your code to run before you even start. That means it's possible to sit down, write a script and get it running in prod within the hour, which is a big deal.

Minerva is obviously heavily influenced by the technological path dependency of the financial sector, which is another way of saying: there is a lot of MS Excel. Any new software solution is going to be compared with MS Excel and if the result is unfavourable people will often just continue to use Excel instead. Many, many technologists have taken one look at an existing workflow of spreadsheets, reacted with performative disgust, and proposed the trifecta of microservices, Kubernetes and something called a "service mesh".

This kind of Big Enterprise technology however takes away that basic agency of those Excel users, who no longer understand the business process they run and now have to negotiate with ludicrous technology dweebs for each software change. The previous pliability of the spreadsheets has been completely lost. Using simple Python functions, in a source controlled system, is a better middle ground than the modern-day equivalent of J2EE. Financiers are able to learn Python, and while they may never be amazing at it they can contribute to a much higher level and even make their own changes and get them deployed.
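
As an illustration of that middle ground, the kind of rule that usually hides in a spreadsheet column can become a small, reviewable Python function; the fee schedule below is entirely made up:

    # A hypothetical spreadsheet-style rule as a plain, testable function:
    # a flat fee rate per currency, with a minimum charge.
    def settlement_fee(notional, currency="GBP"):
        rate = {"GBP": 0.0005, "USD": 0.0007}[currency]
        return max(notional * rate, 1.0)

    assert settlement_fee(10_000) == 5.0
    assert settlement_fee(100) == 1.0  # minimum charge kicks in

Unlike a spreadsheet cell, this version can be diffed, code-reviewed, and changed by the financier who owns it, which is exactly the pliability-with-source-control trade the article describes.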

The Grug Brained Developer:

grug brain developer not so smart, but grug brain developer program many long year and learn some things although mostly still confused

grug brain developer try collect learns into small, easily digestible and funny page, not only for you, the young grug, but also for him because as grug brain developer get older he forget important things, like what had for breakfast or if put pants on [...]

apex predator of grug is complexity [...]

complexity is spirit demon that enter codebase through well-meaning but ultimately very clubbable non grug-brain developers and project managers who not fear complexity spirit demon or even know about sometime

one day code base understandable and grug can get work done, everything good!

next day impossible: complexity demon spirit has entered code and very dangerous situation!

grug no able see complexity demon, but grug sense presence in code base [...]

note! very good if senior grug willing to say publicly: "hmmm, this too complex for grug"!

many developers Fear Of Looking Dumb (FOLD), grug also at one time FOLD, but grug learn get over: very important senior grug say "this too complicated and confuse to me"

this make it ok for junior grugs to admit too complex and not understand as well, often such case! FOLD major source of complexity demon power over developer, especially young grugs!

Books

Linked titles go to relevant records at Open Library unless otherwise indicated.

John Gall's books on "Systemantics".

The Soul of a New Machine, Tracy Kidder (ISBN: 0380599317 / 9780380599318)

Hackers: Heroes of the Computer Revolution, Steven Levy (ISBN: 0385191952)

Programmers at Work, Susan Lammers (ISBN: 0914845713)

The UNIX-HATERS Handbook, Simson Garfinkel, Daniel Weise, and Steven Strassmann (ISBN: 1568842031) (local copy)

Out of their Minds: The Lives and Discoveries of 15 Great Computer Scientists, Dennis Shasha and Cathy Lazere (ISBN: 0387979921). Some select, short excerpts are available on the web.

Patterns of Software, Richard P. Gabriel (ISBN: 019510269X) (local copy). Freely available from the author under a Creative Commons License after going out of print.

Halcyon Days: Interviews with Classic Computer and Video Game Programmers, James Hague (a digital book made freely available on the web)

The Transparent Society, David Brin (ISBN: 0738201448)

Dealers of Lightning, Michael A. Hiltzik (ISBN: 9780061913488, 9780061913501, 9780061913518)

The Social Life of Information, John Seely Brown and Paul Duguid (ISBN: 0875847625 / 9780875847627)

The School of Niklaus Wirth: The Art of Simplicity, edited by Laszlo Boszormenyi, Jurg Gutknecht, and Gustav Pomberger (ISBN: 1558607234 / 9781558607231)

Coders at Work: Reflections on the Craft of Programming, Peter Seibel (ISBN: 1430219483)

UNIX: A History and a Memoir, Brian W. Kernighan (ISBN: 1695978552 / 9781695978553)


2023-06-28, 2023-07-13, 2023-11-30, 2025-01-17