Webster is definitely outdated on this; the word is in a different category today: we don't use 'algorithm' as a mass noun. An algorithm is a concrete recipe for execution, and it is not limited to calculation. So we need a better definition.
An algorithm is a specific combination of discrete steps, observations, or actions into a complex method. It is described in terms of elementary entities: objects, properties, actions, processes, and events. The algorithm itself consists of the way in which these are combined into a whole method. The execution of an algorithm is the sequence of actions that results from applying it; since the circumstances may differ each time, executions of the same algorithm may lead to different outcomes.
A familiar kind of algorithm is the cookbook recipe. Elementary entities (ingredients and tools), observations ('a teaspoonful', '300 g', 'boiling', 'fully dissolved'), actions ('chop into small pieces', 'mix', 'bring to the boil'), processes ('melt', 'boil') and events (in cooking, always time having elapsed, or changes of state of the food being prepared) are the elements; the recipe is the algorithm describing how to combine them in order to create a specific dish.
This also applies to computer algorithms, except that they work not with food but with whatever is available in a computer. A computer program is a concrete implementation of one or more algorithms.
Note that when using the term in this very general sense, algorithms can work with basically anything, not just numbers, and they aren't necessarily stepwise processes executed one by one by a single processor: they may include continuous processes, external, unpredictable events, and nondeterministic, concurrent, or distributed actions. A specific type of activity is calculation, in which all activity takes a restricted form: it is stepwise, the steps being executed one after the other by a single processor; the problem domain is described in terms of a complex but discrete state, the kind that can be represented on a computer, for instance, lists of numbers; all observations are tests of state, all actions are changes of state, and no external state changes occur during the algorithm's execution. Such algorithms of calculation are the type usually discussed in theoretical computing science, which has an extensive theory on their properties and the relationships between them. A calculation is the result of the execution of such an algorithm.
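To make the idea of calculation concrete, here is a minimal sketch of my own (not from the text above): the entire problem domain is a few variables, every observation is a test of that state, every action is a change of state, and the steps run one after another on a single processor.

```python
# A sketch of an algorithm of calculation: discrete state, stepwise
# execution, no external events.
def sum_list(numbers):
    total = 0          # the state: a single accumulator
    for n in numbers:  # each step observes and then changes the state
        total = total + n
    return total

print(sum_list([1, 2, 3, 4]))  # 10
```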
A nondeterministic algorithm does not uniquely determine the course of action at every point, but leaves a choice between a set of options. We usually think of algorithms as deterministic, but this notion is important in the theoretical study of algorithms, most famously in the question, still open in theory, of whether or not P = NP.
Calculation is often numerical, and the classical algorithms of arithmetic (addition, subtraction, multiplication, and division) are greatly emphasized in school. Some well-known numerical algorithms, such as Euclid's algorithm to determine the greatest common divisor, have been known since ancient times.
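Euclid's algorithm, mentioned above, is short enough to state in full; here is a direct Python rendering of its modern (modulo-based) form:

```python
# Euclid's algorithm: repeatedly replace (a, b) by (b, a mod b)
# until the remainder is zero; the last nonzero value is the gcd.
def gcd(a, b):
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # 6
```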
Now that computers are everywhere, algorithms on other types of data become more important: searching and transformation of text strings, image-manipulation algorithms on vector-based or bitmapped images, and manipulations of databases, document structures, or other forms of structured data.
Part of the art of algorithm design is good use of complex data structures.
Different representations of data typically have different efficiency characteristics. For instance, a list of items can be stored in an array, a linked list, or a hash table, each with different average costs in time and space for typical access and update operations.
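A small demonstration of this point (the specific sizes and timings are my own illustration, not from the text): membership testing scans a Python list element by element, but a hash-based set finds the answer in roughly constant time.

```python
# Compare membership tests on the same data in two representations.
import timeit

data = list(range(100_000))
as_list = data        # array-like: membership is a linear scan
as_set = set(data)    # hash table: membership is an average O(1) lookup

needle = 99_999       # worst case for the list: the last element
list_time = timeit.timeit(lambda: needle in as_list, number=100)
set_time = timeit.timeit(lambda: needle in as_set, number=100)
print(f"list: {list_time:.4f}s  set: {set_time:.6f}s")
```

Both representations give the same answer; only the cost differs, which is exactly the trade-off a designer weighs when choosing a data structure.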
dido makes the interesting point that an algorithm is supposed to terminate on all input; in other words, it must express a computable function. The main reason I find this interesting is that I do not consider this requirement to be part of what an algorithm is, while dido does. For example, for me the famous Collatz sequence describes an algorithm, while dido cannot tell. Some Googling demonstrates that dictionaries of computing science disagree on this matter, just as dido and I do. So I will just leave you with a reminder to carefully examine whether the definition of "algorithm" you encounter assumes guaranteed termination or not.
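The Collatz procedure mentioned above makes the disagreement vivid: it is a perfectly concrete recipe, yet nobody has proved that it terminates for every starting value, so under dido's definition we cannot say whether it is an algorithm at all.

```python
# The Collatz rule: halve even numbers, map odd n to 3n + 1, repeat.
# Not known to terminate for all n >= 1, though it does for every
# value ever tested.
def collatz_steps(n):
    steps = 0
    while n != 1:
        if n % 2 == 0:
            n = n // 2
        else:
            n = 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(27))  # 111
```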