A specific theory within functionalism may posit certain structures and processes, and it may make predictions about, say, behaviour, reaction times, or error patterns. With respect to structures and processes, several levels of explanation are possible.
Daniel Dennett's (1978) levels of increasing abstraction for the description of an information-processing device:
Intentional = directed upon (i.e. about something)
Intentional content of a mental state = its semantic properties, i.e. propositions/symbols/representations
Intentional systems (Dennett, 1971; 1978): have rationality and purpose, beliefs and desires, i.e. they have at their disposal goals, procedures and strategies. You would try to outwit a program just as you would try to outwit a person.
Intentional systems are supposed to emerge at a certain level of complexity. From a methodological viewpoint, complex systems are better studied at the intentional level.
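The intentional stance can be made concrete with a toy sketch (my own illustration, not from the source): a program whose behaviour we predict by ascribing beliefs and desires to it and assuming it acts rationally, without inspecting its mechanism. All class and attribute names here are hypothetical.

```python
# Minimal sketch of an 'intentional system' in Dennett's sense: we predict
# its behaviour by ascribing beliefs (what it takes to be true) and desires
# (its goals), plus a rationality assumption. Names are illustrative only.

class ToyAgent:
    def __init__(self, beliefs, desires):
        self.beliefs = beliefs    # e.g. {"door_open": False}
        self.desires = desires    # ordered list of goals, most urgent first

    def act(self):
        """Rationality assumption: do whatever best serves the
        highest-priority desire, given current beliefs."""
        for goal in self.desires:
            if goal == "leave_room":
                return "exit" if self.beliefs.get("door_open") else "open_door"
        return "wait"

agent = ToyAgent(beliefs={"door_open": False}, desires=["leave_room"])
# Taking the intentional stance, we predict: it wants to leave and
# believes the door is shut, so it will first open the door.
assert agent.act() == "open_door"
agent.beliefs["door_open"] = True
assert agent.act() == "exit"
```

The point of the sketch is methodological: the prediction above uses only ascribed beliefs and desires, never the implementation details, which is why complex systems are tractable at this level.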
Marr (1982) identified essentially the same levels in his classic analysis of levels of description for an information-processing device: the computational level (what problem is solved, and why), the algorithmic level (what representations and processes are used), and the implementational level (how these are physically realised). (More on Marr's work in Perception in Semester 2)
Stillings et al. (1989, pp. 328-332) give an example of a chess-playing computer, following Dennett:
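A hypothetical miniature of the chess-computer example (my own sketch, not the Stillings et al. text): one trivial move-choosing device described at each of the three levels. The piece values and move names are simplified stand-ins, not a real chess engine.

```python
# The same toy device described at three levels of abstraction:
# - Intentional level: "it wants to win material, so it takes the queen."
# - Computational level: the function computed is the argmax of captured value.
# - Algorithmic level: a linear scan keeping the best option seen so far.

PIECE_VALUE = {"pawn": 1, "knight": 3, "bishop": 3, "rook": 5, "queen": 9}

def choose_move(capture_options):
    """capture_options: list of (move_name, captured_piece) pairs."""
    best = None
    for move, captured in capture_options:
        if best is None or PIECE_VALUE[captured] > PIECE_VALUE[best[1]]:
            best = (move, captured)
    return best[0] if best else None

moves = [("Nxd5", "pawn"), ("Bxf7", "rook"), ("Qxe8", "queen")]
assert choose_move(moves) == "Qxe8"
```

Each description is true of the same device; which one we reach for depends on whether we are predicting its behaviour, specifying its task, or explaining its inner workings.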
The three levels identified by Dennett/Marr are largely independent. For example, when discussing the computation a device performs, we need not consider the algorithm it employs or its physical structure.
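The independence of levels can be illustrated with a short sketch (my own example, not from the source): one computational-level specification realised by two distinct algorithms, so the computational description alone does not determine the algorithm, let alone the hardware.

```python
# One computational-level task (sort a list) with two algorithmic-level
# realisations. Both devices compute the same function, yet differ in
# process, which is what measures like reaction times could reveal.

def insertion_sort(xs):
    """O(n^2) algorithm: grow a sorted prefix one element at a time."""
    out = []
    for x in xs:
        i = len(out)
        while i > 0 and out[i - 1] > x:
            i -= 1
        out.insert(i, x)
    return out

def merge_sort(xs):
    """O(n log n) algorithm: split, sort halves recursively, merge."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

data = [3, 1, 4, 1, 5, 9, 2, 6]
# Indistinguishable at the computational level...
assert insertion_sort(data) == merge_sort(data) == sorted(data)
# ...but distinct at the algorithmic level (different step counts and
# orderings of operations), and realisable in many physical substrates.
```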
Which of the levels is psychologically the most important for understanding an information-processing system?
For Marr the target of explanation is the algorithmic level, but he claims that this level is best approached top-down: by looking at the problem situation confronting the system. Marr argues that the algorithms underlying cognition are more fully constrained by the computational problems they confront than by the physical structure of the system in which they are implemented. Given that the algorithmic level is not open to direct observation, the most promising route to understanding it is top-down. Dennett agrees that we ultimately desire an explanation at the algorithmic level: to stop at the intentional level is to leave intelligence and rationality unexplained.
Explanations at these two higher levels have had considerable success and fruitful research programmes have emerged, but ontological considerations show that the problem of mind-body dualism remains unresolved.
The above descriptions of levels neglect the role of anthropology: the cultural or sociological level of explanation is presumably considered outside the scope of cognitive science. Gardner's proposal for the three levels is an exception: