last update: 17 May 2003

Zen and the art of personal computer stability

This article tries to answer two frequent questions: "why does my PC hang so often?" and "why are programmers unable to write software correctly?".



- Introduction -

This article has its origins in a joke: one day a colleague, tired of having her computer hang so frequently, sent me an e-mail with a gag about computers and computer sellers. I will not reproduce the gag here because I do not want to debate copyright with my colleague :-), but I report the ending to you:
" Metaphisic paradox invite to think about the machine conditionament limits that umans can accept. Machines were, historically, instruments to free us from hard work, but it seems that actualy machines been conscious of our tecnological dependence and force us to work to made them better in a kink of orrible revenge."

- Technological disease -

That last phrase took me back to a book by Alan Cooper entitled "Technological disease". In this book the author highlights (with a lot of examples) the frustration of a typical PC user who is trying to bend the PC to his or her work. Cooper, like many others, asks for a further effort from programmers in the direction of reducing this unease by improving programs' user interfaces. In the most extreme conclusions they even advocate the elimination of the "interface" altogether. Cooper tends to charge all the unease of the computing world to bad user interfaces. IMHO this is only halfway to the truth, and a small confirmation of this comes from the fact that the Open Source world is rich in examples of very stable programs with a difficult or non-existent user interface.

In any case, all these ideas needed to be developed, so I started to analyse user interface (UI) and PC stability problems. The deeper I went into the study, the more points there were to face. The context is wide, and it is necessary to take into account the differences between human and machine models of elaboration, the inadequacy of programming languages and analysis tools, and who knows how many other parameters.

It was not my intention to go deep into the subject, which was beginning to look to me like a thick jungle because of the large number of topics and pieces of information I was exploring. So I stored everything away in a corner of my mind and kept writing notes, up to the point where I decided to write this article.

- Man-machine model differences -

Every PC user instinctively and unconsciously tends to attribute thinking capabilities to this kind of machine. In the real world a PC is a very stupid thing, and its "thinking" capabilities are, at best, a distant mirror image of the thoughts that the programmer put into the program while defining its logic. From here it is possible to give a first explanation of the anomalies found in some situations: to err is human, and programmers are humans. So programmers make mistakes, and programs contain the results of those mistakes.

As said, the computer is stupid, and programs are operating instructions that programmers write for computers. They should provide a reaction for every value that future operators will input. In a standard program, in fact, the operator interacts by means of parameters and values that modify the program flow at run time. It is easy to understand that the results will be more coherent the more correctly the programmer has analysed all the possible operational variables and parameters of the program itself.
Moreover, it is necessary (regardless of the system's degree of complexity) to accept that computer programs are not applied directly to reality but to models of real cases, which can never be completely faithful to reality itself. Many times such a degree of similarity is not even required, because it is simpler and more useful to operate on a subset of the real possible situations, in order to reduce the number of elements to take care of during the programming phase or at run time.
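As an illustration, here is a minimal sketch (in Python, with invented names and prices, not taken from any real program) of such a simplified model: the function reacts only to the cases its programmer anticipated, and everything outside that subset is simply not handled.

    # A minimal sketch (Python, invented names and prices) of a program
    # built on a simplified model: only the anticipated cases are handled.
    def shipping_cost(weight_kg):
        # The model the programmer analysed: parcels from 0 to 30 kg.
        if 0 < weight_kg <= 2:
            return 5.00
        elif 2 < weight_kg <= 30:
            return 5.00 + (weight_kg - 2) * 1.50
        # Anything outside the model (a negative weight, 31 kg, a mistyped
        # value) is simply not handled: the function silently returns None,
        # and whatever uses that value inherits the problem.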

So programs are used to generate and represent models, and in some cases the number of variables and parameters is so high that you will never be able to identify all the real-case situations. As a logical consequence, if an input situation (at run time) falls into a case that was not correctly analysed at programming time, the result may go in the wrong direction or, in some cases, can take the whole system into an unstable state, because the correct program flow is already compromised.

The most frequent case is the loop: in that run-time situation the program runs through a loop of instructions and executes them without ever changing the variables that could resolve the situation and take the program out of the loop. If there is no visual feedback to the operator, the loop cannot be detected, and the visible effect is that of a hung PC. If to that situation you add a bad operating system, where a single loop can hang the whole system and force you to reboot the PC, you can understand the frustration that some users experience day after day (sometimes with loss of data). Here too the Open Source world shows us a lot of valid solutions, thanks to the availability of a number of *NIX-like systems which, by their design philosophy, run programs in separate tasks, so that it is nearly impossible to hang the whole system (I do not want to go deeper into this point, so I have simplified it).

It is also necessary to consider another point, namely how well the human brain adapts compared with the computer. There are two elements to face. The first: for a PC the thinking paths are defined by the program flow, while a human can react by himself to new elements. Someone could start a flame on this point, saying that human education is a kind of program that can prevent the human brain from reacting correctly to some situations; but this kind of argument actually validates the comparison between human and computer reactions, and highlights even more the ability of the former to adapt to new situations.
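Before moving on to the second element, here is a minimal sketch (in Python, with invented names) of the loop scenario described a few paragraphs above, where an answer the programmer did not anticipate leaves the program spinning with no feedback to the operator:

    # A minimal sketch (Python, invented names) of the loop described above:
    # the programmer only anticipated the answers "OK" and "ERROR".
    def wait_for_device(read_status):
        status = read_status()
        while status not in ("OK", "ERROR"):
            # An unanticipated answer (say, "BUSY") never changes the
            # condition, so the program spins here forever with no
            # feedback to the operator: the PC looks hung.
            status = read_status()
        return status

A timeout, or a branch for "any answer I did not foresee", is exactly the piece of analysis that is missing when this kind of hang appears.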
The second element comes from the fact that a program can elaborate only logical, binary data that falls within the architectural limits of the machine, while for a human every piece of data is analog, and the same value can be shifted from its nominal meaning by a "convenience" factor that derives not only from the situation but also from previous experience. To put it simply: if I "tell" the computer that I have 1 kg of sugar (in the context of a cooking program), it can answer that this is enough to make my cake, which requires 1 kg of sugar. A human can also accept a value of 990 g, reducing the other ingredients or accepting a cake with a little less sugar in it. I am sure that the PC would discard this value and would not let me go on cooking. The example is simple, but it can be compared to thousands of similar situations where we find it illogical that the computer cannot adapt to the situation.
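The sugar example can be sketched the same way (again in Python, with invented names and an arbitrary tolerance): the rigid check is what the hypothetical cooking program does, the tolerant one is roughly what a human cook does without thinking about it.

    REQUIRED_SUGAR_G = 1000

    # The rigid check: the hypothetical cooking program accepts the exact
    # amount or more, so 990 g is rejected outright.
    def sugar_is_enough(grams):
        return grams >= REQUIRED_SUGAR_G

    # A more "human" check: a small shortfall (here 2%, an arbitrary choice)
    # is accepted, leaving it to the cook to adjust the other ingredients.
    def sugar_is_acceptable(grams, tolerance=0.02):
        return grams >= REQUIRED_SUGAR_G * (1 - tolerance)   # 990 g passes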

- The inadequacy of programming languages -

At this point of the analysis it remains for us to examine the inadequacy of programming languages as a possible source of technological disease. That programming languages are inadequate is not just an opinion of mine, but the conclusion of the following analysis.
Let us look at the initial use of computers. At the beginning they were machines built to do mathematical calculations. The first generation of computers did not have real software: the program was hardwired in the circuits. Programming a computer meant modifying its electrical circuits and changing the paths of the wires, and those operations were reserved to the same people who had built the machines. Calculations were basic and the answers were interpreted by reading lights on the front panel.
The programming language "was" the machine, and a user interface was not on anyone's mind.

Some years later things got better: a kind of switch programming arrived and (later still) punched cards, with answers printed on paper (it must be clear that I am summing up a whole evolution in a few words). From the programming viewpoint, technicians able to understand the hardware were still necessary, but it was no longer necessary to rebuild the computer for every program that was needed. The user interface was surely better, as a printed answer was a big step into the future.

With the microprocessor a drastic change came and programming languages saw the light. But not immediately! Anyone who tried to program a "personal computer" in the '70s can tell you that programming meant entering binary values into memory cells through switches, that the results came out on LEDs, and that the memory pointer was moved with another switch. It must be said that in the same period the UNIX operating system was already running on big computers, but the situation of personal computers mirrored that of those first big machines. They were based on the studies of John von Neumann, who designed a computer built around one CPU, a slow-access storage memory (an ancestor of the hard disk) and a fast-access primary memory similar to today's RAM. This machine stored instructions as binary values and executed them sequentially. Even today computers still reflect this basic model in their structure.
Binary code is directly tied to its processor and is called machine language. It is the lowest and most direct level of programming and requires a full understanding of the hardware. The first level of abstraction is assembly language, which associates each CPU instruction with a mnemonic code that is easier for programmers to understand and makes programming easier. Assembly is the first level of user interface, introduced to make it easier to work on a computer which, in fact, can understand only machine code; for this reason every compiler or interpreter, in the end, turns the program into machine code instructions. The abstractions introduced by programming languages are needed only to make the programmer's work easier, through a syntax that is easy to remember or an easier definition of the problem and of the algorithm that solves it.
The first high-level programming language was FORTRAN (FORmula TRANslator), which made it easy to express mathematical computations. The second was COBOL (COmmon Business Oriented Language) which, as a complement, covered the commercial computations that were hard to program comfortably in FORTRAN.

The first programming languages aimed more at solving the problem than at managing user interfaces, and their syntax reflects this. Output was as simple as possible: a request for a variable or the result of a computation. Programs were sequences of instructions with a conditional flow. The analysis tools of the period were at the same level, and flowcharting was one of the best tools used to follow program logic.

Let us leave aside for a while these thoughts on programming language abstraction and turn to the user interface. From what you have just read you can understand that the need to interact with the user was born relatively late in computing history. Machines, languages and analysis tools, at the origin, devoted most of their effort to making the machine work and only minimally to interfacing with the user. This may start a flame, because it is also true that research on ergonomics and man-machine interfaces is relatively old, and the results coming in the '70s from the Xerox PARC labs are part of computing history. But what I want to point out is that the massive diffusion of user-interface concepts, and real care for end-user needs, had to wait until the end of the '80s to see widespread application.
Closing this parenthesis, it is necessary to note how programming languages have, over time, evolved to ever higher levels of abstraction, up to the current object model, which focuses on the problem and the methods involved, producing a total abstraction from the hardware if not from the writing of software itself. The limit of this situation is that nowadays an excess of abstraction produces programs that are too big (fatware) and poorly optimized, and that tend to underuse the processor. On top of that, ever more complex structures of software upon software are created, and an error at a lower level can create problems at the upper levels and drive the system to instability.

A question that often comes from people new to programming is: "Why are there so many programming languages?". It is not easy to answer, and sometimes, looking at programming languages, one can see that many of them differ just a little in syntax and a little more in field of application.
The answer can be traced back to the abstraction needs we pointed out before, and it is possible to say that no programming language can be the best for solving every problem. Considering only three of the basic requirements, a programming language should be the best for:

  1. a given application or application field
  2. a given system
  3. a given group of programmers
To achieve this, the language would need to be the best for every requirement, producing the best code or offering the easiest use. Such a language, by definition, cannot be the best for a large number of applications or application fields, or for a large number of systems, so every programming language ends up positioning itself somewhere between "the best for that specific use" and "general-purpose".
It is also worth saying that a very specialized language has a steep learning curve that leaves out generalist programmers and tends to make it difficult to recycle experience.
It is easy to understand that the presence of so many languages makes it difficult for a programmer to choose to follow only one without the risk of being, sooner or later, out of the job market. This situation carries further difficulties in itself, because a programmer has to follow the innovations of a number of languages and keep up to date with every new version. As a logical consequence, programs written in a language that is not the programmer's preferred one are more prone to errors. The same can happen when the language is the preferred one but is not the most suitable for solving the problem.

A parameter that is sometimes not given proper attention when looking for the sources of system problems is software degradation. Software obviously has no parts subject to physical wear, but over time its quality degrades. Comparing the life cycle of industrial components with that of software, it is possible to see that components show a peak of problems (and related costs) when they are new, due to prototype problems; then a phase of stability follows, up to the point of maximum consumption and wear.
The software life cycle shows an initial peak of problems due to early bugs or design inadequacy, and a continuous decrease thanks to debugging; then there are progressive peaks caused by new releases. After every modification the curve does not go back to the initial stability but settles at a new stable level that is worse than the previous one. This indicates that software loses quality at every modification, until it requires a complete revision or replacement.
A solution to these problems seems to be software reuse, which has found its highest expression in software components.
The complete reuse of software components has not been fully realized, and at present many programs are still written from scratch. This is caused above all by the uniqueness of models and situations: the more complex and specialized a system is, the less it is possible to reuse software components and code.

Another kind of wear is what I call "environmental" pollution, and it shows up at run time. I use this term to indicate the situation in which a stable (hardware + software) system is degraded by the installation of additional software. A bad situation can come from the replacement of dynamic libraries with new versions not correctly aligned with the rest of the operating system; another case can be the reduction of free space on the hard disk after a new software install. The latter problem is easy to solve, but it shows how the spread of the PC can carry problems with it when software installation is left to non-technical people. The best way to avoid "environmental" pollution is to dedicate a PC to a single purpose and, once it reaches a good level of stability, not to alter it.
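The free-space case mentioned above really is the easy one; a minimal sketch (in Python, with an arbitrary threshold, not tied to any real installer) of the kind of pre-install check that avoids it could be:

    # A minimal sketch (Python, arbitrary threshold) of a pre-install check
    # against the free-disk-space problem mentioned above.
    import shutil

    def enough_free_space(path="/", required_gb=2.0):
        free_gb = shutil.disk_usage(path).free / (1024 ** 3)
        return free_gb >= required_gb

The library-version case is the harder one, because the mismatch only shows up later, at run time, in programs that had nothing to do with the installation.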

- The inadequacy of analysis tools -

As we saw before, over time there has been a progressive abstraction of programming languages: at the beginning to overcome hardware complexity, then to separate the problem from the language, and finally to abstract objects from the problem in order to clearly distinguish the methods applied to them. A similar evolution has taken place in analysis tools:
before programming it is (sometimes) necessary to design before coding. A software project passes through an analysis phase, and this phase is based on tools and methods that have evolved along with the rest of the computing world.

The first kind of analysis was the classic, bottom-up one that came from the car industry. A problem was followed from start to end, defining some macro blocks and solving problems along the way; then there were some refinement phases, up to the solution.
In the '80s there was a boom of structured languages and structured analysis. That kind of analysis required a strong initial model in which all procedures had to be defined. There was a period when a lot of money was spent on initial models, and projects were abandoned before a single line of code had been written.
Nowadays an initial model is required as soon as possible, in order to start collecting comments from the end user. After the first stage a new design/model is developed and tested; then a series of test-development-release cycles begins, until the software reaches a good working condition.

Some analysis tools are meta-languages that allow the problem to be expressed without worrying about programming language syntax. Sometimes these tools are complex and require a new effort from the programmer. Even though I consider it very good, I find it very difficult to learn to use UML, which is one of the most widely used analysis languages. What needs to be said is that analysis costs often discourage customers from starting a project, and this situation is more visible in Italy than in other EU countries.

- Conclusions -

Whoever has followed me up to this point now has a thousand arguments to use as an answer to the initial questions. I want to add that automation should not outgrow its own profit or convenience: it should yield a benefit good enough to let us make technology grow further. Too much technology can destroy standard working procedures, and the sense of unease in users can decrease their productivity, destroying the advantages of automation.

For any mistake you find (especially in the translation) please write to me...

by Rudi Giacomini Pilon