More Analog Computing Is on the Way
Programming an analog computer seems a lot more like electrical engineering than coding. But with the return of analog computing in a hybrid analog/digital context, progress is being made on methodologies that allow us to "write" analog programs rather than "wire" them.
A while ago I wrote an article, "Analog Computers Return: Finally," that gave a quick overview of some of the pros and cons, along with examples of how fundamentally different the analog approach is from today's digital methods. Some readers came away from that article with the impression that programming an analog computer is more like breadboarding an electronic circuit than typing in program code. At its most primitive level, this is certainly true: changing the scale weight on an input parameter of an analog computer program usually amounts to selecting a resistor value to represent the scale.
But digital computing in its early stages was much more like electrical engineering, too. Some early, commercial-grade digital computers (in the early 1960s) were programmed by physically inserting jumper wires on the back of a cookie-sheet-sized patch panel. You wrote a program by making jumper connections between individual logic modules inside the computer, and you loaded it by physically plugging your patch panel (as many as a thousand pins!) into a receptacle on top of the CPU cabinet. There were physical risks, too: accidentally bending one of the pins, or straining your back, because the insertion force for those giant multi-pin connectors was quite large.
Somewhat later, we used toggle switches (a row of physical switches representing the bits in the computer word) to set each bit of a word, then pressed the load button to store it in memory and advance the memory address by one. This, too, felt a lot more like controlling an instrument than programming a computer.
This exact model of computer was used for speech recognition research at MIT in the late 1960s. To start it, the operator had to load a 100-instruction paper-tape driver program by hand, one word at a time. The tape reader then read a paper tape containing the magnetic-tape driver program, and finally the primitive operating system was loaded from magnetic tape.
Analog computers of the same era were also programmed by patching wires between basic functional modules. But in the case of analog computers, the modules were amplifiers, capacitors, and resistors.
Digital computers have advanced quite far from the early days of connecting individual gates and loading individual words by hand with wires and switches. The first step in evolving from wiring to typing was simply a small set of instructions that mimicked the connections made by jumpers. These early machine-code languages literally meant things like "connect output to register 1, write data." From that very explicit formalism we advanced to more sophisticated assembly languages, and with the appearance of FORTRAN and COBOL we were able to manipulate data and operations in a completely logical fashion. We had a compiler that abstracted and insulated us from any knowledge of where our data or instructions were being physically manipulated. Today, of course, we have abstractions on top of abstractions (abstractions all the way down?). In fact, I would bet that a very large majority of today's developers have only the vaguest notion of how the actual physical work of computing is accomplished.
With our renewed interest in analog computers, we are following the same path. There is a great deal of power to be had with analog techniques, but for many of the same reasons that we no longer wire our digital programs, we do not want to wire our analog programs either. Aside from all the problems of jumper cables and the substantial electrical engineering knowledge the programmer needs, the biggest drawback of analog computing has been that programs cannot easily be copied; essentially, every program is physically handcrafted on the computer. We need the equivalent of an abstract, text-based language that can be compiled into a set of instructions that tell the analog computer how to make/map the physical connections. And, of course, the analog computer needs to rewire itself using electronic, solid-state techniques.
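To make the idea concrete, here is a minimal, hypothetical sketch of what "compiling" an equation into a wiring map might look like. The block names and netlist format below are my own illustrative inventions (not the actual output of any real analog compiler): the equation dx/dt = -k·x becomes a single integrator whose output feeds back through an inverting gain stage.

```python
# Hypothetical mini-"compiler": turn dx/dt = -k*x into an analog netlist.
# The block types ("integrator", "gain", "wire") and the netlist format
# are illustrative assumptions, not any real compiler's output.

def compile_decay(k, x0):
    """Emit a wiring map for dx/dt = -k*x with initial condition x0."""
    return [
        ("integrator", {"id": "int1", "initial_condition": x0}),
        ("gain",       {"id": "g1", "factor": -k}),
        # Feedback loop: integrator output -> gain -> integrator input.
        ("wire",       {"from": "int1.out", "to": "g1.in"}),
        ("wire",       {"from": "g1.out", "to": "int1.in"}),
    ]

if __name__ == "__main__":
    for block, params in compile_decay(k=2.0, x0=1.0):
        print(block, params)
```

A real compiler, of course, must also handle scaling, routing constraints, and the limited number of physical blocks on the chip, which is where the hard engineering lives.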
The door on this new generation of analog computer programming is definitely open. Last month, at the Association for Computing Machinery's (ACM) conference on Programming Language Design and Implementation, a paper was presented describing a compiler that uses a high-level, text-based abstraction language to generate the low-level circuit wiring that defines the physical analog computing implementation. This research was done at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and Dartmouth College. The main focus of their investigation was to improve the simulation of biological systems.
One of the first analog computer programs I wrote in my professional career was a simulation of pulmonary ventilation/diffusion/perfusion. It was "written" on an Applied Dynamics computer.
The researchers tested their compiler on several sets of differential equations (remember, differential equations are the best way to describe dynamic systems, and they are also what analog computers do best). Some of the smaller sets of four or five equations took about a minute to generate the wiring map for the computation; the most complicated example, involving 75 related differential equations, took almost an hour to compile. Even at this early stage, that is far faster than any human could manage. The pulmonary program I mentioned above used fewer than 20 differential equations and took me at least two weeks to get working. Analog compiler technology is still in its infancy, so dramatic advances in speed, complexity, and robustness seem likely to come soon.
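For readers who want to see what such a patched-up circuit actually computes, here is a small digital sanity check (a sketch, not anything from the paper): an integrator with a -k feedback gain implements dx/dt = -k·x, whose exact solution is x(t) = x0·e^(-kt). We can mimic the analog loop with simple Euler integration and compare.

```python
import math

# Digital sanity check of the analog setup for dx/dt = -k*x:
# an integrator fed by a -k gain block should trace x(t) = x0 * exp(-k*t).
def simulate(k, x0, t_end, dt=1e-4):
    x = x0
    t = 0.0
    while t < t_end:
        x += dt * (-k * x)   # the gain block feeds -k*x into the integrator
        t += dt
    return x

if __name__ == "__main__":
    x_num = simulate(k=2.0, x0=1.0, t_end=1.0)
    x_exact = math.exp(-2.0)
    print(x_num, x_exact)  # the two values agree closely
```

The analog machine does this continuously and in parallel for every equation at once, which is exactly why large systems of coupled differential equations are such a good fit for it.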
As Adrian Sampson, an assistant professor of computer science at Cornell University, said: "'Digital' is almost synonymous with 'computer' today, but that's actually kind of a shame. Everybody knows that analog hardware can be incredibly efficient — if we could use it productively. This paper is the most promising compiler work I can remember that could let mere mortals program analog computers. The clever thing they did is to target a kind of problem where analog computing is already known to be a good match — biological simulations — and build a compiler specialized for that case. I hope Sara, Rahul, and Martin keep pushing in this direction, to bring the untapped efficiency potential of analog components to more kinds of computing."
No doubt, there is a resurgence in analog computing today.
Opinions expressed by DZone contributors are their own.