AAC AND THE MANIPULATION OF MODALITIES
I. What is a Modality?
Augmentative and Alternative Communication is a story about Modalities and how they may be switched when one becomes disabled. But what is a Modality? The word Modality, like the word Spouse, can mean different things depending upon who is using it.
To some the word Spouse means a companion, a friend and an object of affection. To others it may mean a monster, a liar or a jailor. Likewise, many disciplines find a use for the word "Modality."
To the Theologian, the word Modality is used in Christianity to refer to the structure and organization of the local church. The Universal Catholic Church is the modality as described in Catholic Theology.
To the Lawyer, Modality refers to the basis of legal argumentation in United States constitutional law.
To the Musician it is a subject concerning certain diatonic scales known as musical modes. It's also why I never learned to play the piano!
To the Sociologist, it is a concept in structural theory.
To the Philosopher it is the qualification in a proposition that indicates that what is affirmed or denied is possible, impossible, necessary, or contingent, among other things. So much for these notions; there are others. But let's look at some that are more relevant to a discussion of AAC.
II. A MODALITY IS A SENSORY SYSTEM AND MORE.
To the Medical Profession a Modality may be the faculty through which the external world is apprehended. Hence, it can refer to a sense organ or a specific sensory channel (system), such as in vision or hearing. Seeing and hearing, of course, are not just functions of the transducers (i.e., the eyes and ears, which convert stimuli from the environment into analogous patterns of electrochemical impulses in the brain). They involve a complex neurological infrastructure that exists beyond the transducer to organize and interpret the stimuli. This infrastructure can be portrayed adequately for our purpose by Mysak's Model, an "oldie but a goodie," with a few adaptations.
Severe memory disorders due to retardation or brain trauma may inhibit not only communication but also the ability to cope with daily living routines. AT can be very useful here in providing systems and/or mechanisms to bridge the gap. Picture schedules generated by a computer, for example, can allow a child or adult with a severe memory deficit to participate in, and even take responsibility for, some daily activities.
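To make that idea concrete, here is a minimal sketch (in Python) of how a computer might assemble a printable picture schedule from a list of daily activities. The activity names and image file names are simply placeholders, not part of any particular product.

# A minimal sketch of a computer-generated picture schedule.
# The activities and image file names below are hypothetical placeholders.

daily_schedule = [
    ("Wake up",       "sun.png"),
    ("Brush teeth",   "toothbrush.png"),
    ("Eat breakfast", "cereal.png"),
    ("Ride the bus",  "bus.png"),
    ("Circle time",   "circle.png"),
]

def build_schedule_html(schedule):
    """Return a simple printable HTML page: one picture per activity, in order."""
    rows = []
    for step, (label, image) in enumerate(schedule, start=1):
        rows.append(
            f'<div class="step"><img src="{image}" alt="{label}">'
            f'<p>{step}. {label}</p></div>'
        )
    return "<html><body>\n" + "\n".join(rows) + "\n</body></html>"

if __name__ == "__main__":
    with open("picture_schedule.html", "w") as f:
        f.write(build_schedule_html(daily_schedule))

Printed out, the page gives the child a fixed visual sequence to follow, and the schedule can be regenerated whenever the routine changes.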
In AAC, where oral speech is not the output source, these channels can have a different meaning. For example, in Computer Technology, a Modality is a path of communication between the human and the computer. Hence, these channels may represent modalities by which a patient may communicate with a computer, such as by touch (direct selection), by a mouse (proportional control), or through a switch in conjunction with a scanning system. This extends the concept of a modality beyond the limits of the body Sensory System to an AAC device that now becomes part of the communication pathway. In fact, this notion is embodied in the etymology of the word "Modality," which involves the employment of any therapeutic agent. The number of different kinds of devices in the world of AAC that are available to fit into this pathway, of course, is quite large. This provides a plethora of options for the SLP to consider in planning a program of rehabilitation.
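As a rough illustration of the switch-plus-scanning pathway, here is a minimal Python sketch that cycles a highlight through a small set of messages and returns whichever one is active when the "switch" fires. The keyboard stands in for the switch, and the message set is made up; a real scanning device advances the highlight on a timer rather than on a keypress.

# A minimal sketch of single-switch scanning.
# Pressing Enter plays the role of the switch closure; typing anything
# else first lets the highlight advance to the next item.

import itertools

choices = ["YES", "NO", "HELP", "MORE", "STOP"]   # hypothetical message set

def scan_and_select(items):
    """Highlight one item at a time; return the item active when the switch fires."""
    for item in itertools.cycle(items):
        pressed = input(f">> {item} <<   (Enter = select, any letter then Enter = skip) ")
        if pressed == "":          # switch closure: the highlighted item is chosen
            return item

if __name__ == "__main__":
    print("Selected:", scan_and_select(choices))

Direct selection and proportional (mouse) control would replace scan_and_select with their own selection routines; the rest of the communication pathway stays the same.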
To some Education Theorists, Modalities are Learning Systems which can be reduced to channels such as the visual, auditory and motor modalities. Visual children tend to learn by watching and looking at pictures and may be easily distracted by movement and action in the classroom. Auditory children tend to learn by being told, respond to verbal instructions, and may be easily distracted by noise. Those who respond to motor/kinesthetic stimuli tend to be involved and active, would rather do than watch, and prefer 'hands-on' projects. Language skills have also been similarly classified by modality. The Illinois Test of Psycholinguistic Abilities is a case in point. It makes an assessment of the Auditory and Visual modalities to determine which is functioning the best for learning and communication, which may be significantly impaired, and if so, where the breakdown in the pathway may have occurred.
The concept of Learning Modalities is useful in AAC. It provides a framework for understanding the redundancy of language in the pathways of the brain. We will review three Learning Modalities for this purpose. These are the Auditory, Visual and Haptic Modalities. Of course I am hedging to save paper, as professors love to do, because the Haptic Modality itself is really a composite of three separate Modalities: Tactile, Kinesthetic and Vestibular. So there are really five! Well, there are more than that if we consider the gustatory and olfactory tracts, but we won't, at least for now. Each modality, with the possible exception of the Vestibular (the sense of balance), can support language and the communication processes. When communication breaks down (or fails to develop), it is necessary to examine each modality in detail to determine where the break may have occurred and what alternate routes may be possible without and/or with the aid of technology. This is referred to as Task Analysis. Failure to do this can have tragic consequences! Take, for example, the story of Julia.
Julia was a young woman who suffered a stroke which left her almost totally paralyzed from head to toe. All she could finally do was make a kind of guttural sound, but no speech. The doctors, nurses and family assumed she had no language, and hence everyone talked in front of her. Frequently they referred to her as a vegetable, and made jokes or other unkind statements, assuming she could not understand. In truth, Julia had considerable language capacity. Only her expressive language was impaired. In terms of receptive and inner language, she was quite normal. Hence, she understood and endured with anguish all that was being said. In addition, she suffered from the terrible isolation that occurs when the language bridge is broken. It was six years before someone became suspicious that she was not a "vegetable," and began to explore her actual language abilities. Finally she was freed from her body prison through AAC. Had someone analyzed her language processes more thoroughly in the beginning, she would not have had to suffer so severely for so long!
The complete Julia file tells the full story.
History is full of similar cases. The well-known term "deaf and dumb" is testimony to the old notion that deaf individuals who cannot speak have little language capacity. To the contrary, they are lacking in only one area of language processing (the receptive transducer for the Auditory Modality). The classic example, of course, to repudiate this idea is the life of Helen Keller. She was, of course, both Deaf and Blind and still learned to communicate by speaking!
Helen Keller
A. What is a Task Analysis of Language?
We have discussed the processes that occur in the Sensory System. We are still talking about these processes, but now we would like to describe their overall arrangement in each of the Learning Modalities: the Auditory, Visual and Haptic Modalities.
The Auditory, Visual and Haptic Neural Systems
For each of the modalities, these processes may be distributed among the Receptive, Inner, or Expressive components of Language. It is important to analyze these components individually when we are assessing the language abilities of a child or an adult. This would have saved Julia a lot of grief by identifying both her language weaknesses and strengths.
A. Receptive and Expressive Language Skills: What identifies a Receptive (or Inner) task as opposed to an Expressive task? A test of Receptive and Inner functioning relies on minimal voluntary responses, using forms such as pointing, nodding, grunting, blinking, twitching and other similar behaviors, which I often see in the back row of my AAC class on campus. The emphasis is on the timing, not the efficiency, of the movements. Gestures could also fall under this category.
Expressive tests, on the other hand, focus on the efficiency and competency of the response. Hence, tests that require a complex response are expressive tests. Speaking, writing, drawing, Signing, and pantomime are some examples of the complicated responses that we might observe for an expressive task.
B. The Auditory Modality consists of the neural system that extends between the receptive transducer (the ear) and the expressive transducer (the mechanisms for speech). Phonemes, of course, are the basic units for social communication in this modality. Hence, speech is the most typical form of encoding involved. Morse Code would be another possibility. These, as we discussed, are based on a system of symbols.
Because in Semiotics the manner in which information is encoded is considered a modality, both the stimuli and any motor responses to the stimuli could be considered as additional segments of the modality structure. Other modality segments for the Auditory Modality, then, would include messages that are encoded as signs. Examples, as we discussed earlier, are nominal graded signs (viz., stomach rumbling, burping, etc.); nominal combinative signs (echolalia); expressive graded signs (viz., moans, shouts, crying and laughing, etc.); and expressive combinative signs (viz., swearing, singing, social speech forms such as "How are you today," idioms, and proverbs, etc.). These provide many possible options for variation in the Auditory Modality, some of which can and do operate simultaneously. These options also provide a redundancy in communication for alternative routing when a segment of the modality is disabled. If speech fails, for example, communication may still be possible through Morse Code.
Yes, it is true that friends of a disabled patient may not be able or willing to learn Morse Code, but a computer will, and can even convert the code into speech!
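As a small illustration of that last point, here is a minimal Python sketch that decodes switch-entered Morse Code into ordinary text. Passing the result on to a speech synthesizer is only indicated in a comment, since the synthesizer itself is outside this sketch.

# A minimal sketch of decoding Morse Code into text.
# In a real AAC setup the decoded string could then be handed to a
# text-to-speech engine; that step is only noted below.

MORSE_TO_LETTER = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode_morse(message):
    """Letters are separated by spaces, words by ' / '."""
    words = []
    for word in message.split(" / "):
        words.append("".join(MORSE_TO_LETTER.get(code, "?") for code in word.split()))
    return " ".join(words)

if __name__ == "__main__":
    text = decode_morse(".... . .-.. .--. / -- .")
    print(text)            # HELP ME
    # A text-to-speech engine would speak `text` aloud at this point.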
At this point, then, our Auditory Modality Structure may look something like this:
C. The Visual Modality includes those neural systems that extend between the receptive transducer (the eye) and the expressive transducers (the motor mechanisms required for writing, and/or Sign Language, Pantomime and gestures).
Writing and Sign Language are based on symbols such as graphemes (written letters) and visual patterns of space and movement (the Signs of Sign Language). But there are many other communications based on signs. Examples of these are nominal graded (viz., thrashing and crying); nominal combinative (viz., pointing, gesturing, etc.); expressive graded (viz., body "language"); and expressive combinative (viz., swearing gestures, and social routines like opening a door and letting someone else go first).
At this point, then, our Visual Modality Structure may look something like this:
D. The Haptic Modality is yet another channel within the brain that can support the processes of language. This modality, however, is a composite of two more basic modalities: the Tactile and the Proprioceptive Modalities.
1. The Tactile Modality is the sense of touch and, of course, is very familiar to us. Its receptive transducer is the system of nerve endings just under the skin. The role of the Tactile Modality in our cognitive development may be underestimated by most of us. The Tactile Modality plays a major role in the child's exploration of the environment. It helps a baby to develop an awareness of the body's limits, of which the newborn baby is unaware. It helps us to keep tabs on where we are in space. We can gauge much about our body position from what we feel through our feet on the floor and from our seat and back against a chair. Knowing where we are in space is paramount to the development of many language concepts (e.g., prepositional phrases) and language skills (e.g., discriminating "b" from "d" from "q" from "p"). Hence, as you observe a baby, you may notice that they spend much time touching and rubbing against things with their hands, feet, legs, lips and tongue. This is as much a process of serious study and exploration as is the busy bustling of a scientist about his laboratory.
2. The Proprioceptive Modality is also two sub-modalities experienced as one: the Kinesthetic and Vestibular Modalities.
3. The Kinesthetic Modality is tantamount to our "eyes" looking inward to our own body. The transducer for the Kinesthetic Modality is the system of nerve endings in the joints of the body and in the muscles. Like the sense of touch, it is very important to the development of body awareness. In fact, in cases where this process fails, a person can totally lose the awareness of a body part! To the baby, the kinesthetic sense is also a basic ruler for exploring and understanding the environment. Initially, visual (or auditory) images provide no real information to the baby about the properties of referents (things) such as angles, sizes, shapes, distances or mass. This information is obtained first hand (no pun intended) as the baby comes into physical contact with and manipulates the referents in their environment. The baby's hands, feet or mouth are constantly probing objects that are within their grasp: rattles, blocks, rails on the crib, balls, table legs, fingers, etc. These objects' properties are measured by the kinesthetic sensory system, which calculates and stores body angle, tension, fatigue, etc. This information is cross-referenced with the Visual and Auditory Modalities to give them a basis for meaning. The Vestibular Modality also plays a role in this exploration by providing a reference in space around which positions of up and down can be determined. Symbolically, the Kinesthetic Modality alone can support language in the form of Braille, writing, typing and touch Signing, to name a few forms.
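As an aside, here is a minimal Python sketch of one of those forms: it renders ordinary text as Unicode braille cells (Grade 1, letters only, no contractions or numbers), showing how the same message can be re-encoded for a tactile/kinesthetic channel. The example word is arbitrary.

# A minimal sketch of rendering text as Unicode braille cells.
# Grade 1, letters only; anything unsupported becomes a blank cell.

# Dots raised for the letters a-j; the later "decades" add dots 3 and 6.
_A_TO_J = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4}, "j": {2, 4, 5},
}

def _dots_for(letter):
    if letter in _A_TO_J:
        return _A_TO_J[letter]
    if letter == "w":                        # w falls outside the decade pattern
        return {2, 4, 5, 6}
    if "k" <= letter <= "t":                 # k-t: same as a-j plus dot 3
        return _A_TO_J[chr(ord(letter) - 10)] | {3}
    if letter in "uvxyz":                    # u-z (minus w): a-e plus dots 3 and 6
        return _A_TO_J[{"u": "a", "v": "b", "x": "c", "y": "d", "z": "e"}[letter]] | {3, 6}
    return set()

def to_braille(text):
    cells = []
    for ch in text.lower():
        bits = sum(1 << (d - 1) for d in _dots_for(ch))
        cells.append(chr(0x2800 + bits))     # U+2800 is the blank braille cell
    return "".join(cells)

if __name__ == "__main__":
    print(to_braille("hello"))   # prints the five braille cells for h-e-l-l-o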
One goal of early education is to provide young children with as many opportunities as possible to examine many different referents. The ultimate goal is to develop concepts upon which language can be mapped. The story of Montessori is a grand example of using the Haptic Modality for this purpose. The Montessori approach stresses hands-on exploration at an early age. But it is just this modality and these types of experiences that are denied to the severely motor-disabled child. This child, because of his/her impairment, is unable to interact with and explore the environment. The consequence is a lack of information upon which to develop basic concepts about the world and around which language can be developed. Later in life, the motor-impaired child may not have as much to communicate about because of this dearth of concepts. Hence, language and communication are impaired two times over: because of the language delay, and because of the motor impairment.
At this point, then, our Haptic Modality Structure may look something like this:
E. Cross Modality Processing: When a person listens and then speaks, the communication process is confined to a single modality. But frequently the action may involve two or all of the modalities in concert.
For example, in the Peabody Picture Vocabulary Test, the stimuli are words (auditory) and pictures (visual). Hence, an additional process of cross modality conversion becomes involved.
Yes, I am a CSUN Peabody Scholar, and I'm demonstrating a Cross Modality Conversion. Please give me all your pictures of Washington, Jefferson, Lincoln and Hamilton.
Actually, hereÕs a more accurate portrayal of a Cross Modality Conversion:
In this manner all modalities may be involved simultaneously. For example, in a reading-instruction strategy called the "Writing Road to Reading," a pupil writes the word (viz., in the sand) for Haptic processing, says it aloud for Auditory processing, and looks at it for Visual processing, all at the same time.
F. Modality Extensions: To a Medical Doctor, a Modality may refer to the employment of, or the method of employment of, a therapeutic agent. Hence, in the world of AAC, for a patient who cannot speak, the therapeutic agent may be a Communication Board, pictures used in a particular manner, or a Computer with speech output, among others. This extends the reach of a modality structure beyond the limits of the body to incorporate an external technology, like a computer.
1. In Computer Technology, a Modality is also a path of communication between the human and the computer. Hence, a patient may communicate with a computer by touch (direct selection) or by a mouse (Proportional control), or through a switch in conjunction with a scanning system. And of course, there are many different kinds of computers to be considered that can be used as part of this path. Now our Modality Structure may look like this on the Expressive side:
Ironically, when the expressive modality is impaired to the extent that there is only a minimal motor response available for communication, and an AAC device is deemed to be an appropriate rehabilitative strategy, the role of the Receptive segment of the Modality System takes on a new importance. In order to use an AAC device, a patient must be able to discriminately see, hear and/or touch it. This in itself may require Assistive Technology, like glasses or a hearing aid. But then, the question becomes, "What is the patient able to decode?"
That is, what should there be on the device for the patient to choose from: words, letters, pictures, photos?
2. In Computer Science, especially Computer Imaging, the type of input is considered to be a modality. For example, Black and White would be one modality, and Color would be another. This is equally true for input from an AAC device, like a computer, which a patient might be using for communication. But there is more. The input may be linguistically symbolic, like phrases, words, and letters; or it may have graphic symbols like pictures or Bliss Symbols; or it might use signs like Happy, Sad, Yes, No or Stop. The pictures may have abstract meaning or iconic meaning or both. A picture of an apple, for example, may mean "apple," but in combination with a picture of a truck, it may mean "red." Bliss Symbols are another example of a symbol system that has a high degree of iconicity (i.e., it looks like what it signifies).
Then again, pictures may be photographs, colored drawings, two-dimensional black-and-white sketches, or stick figures. Each in its own right would be considered a modality segment available for plugging in or out of our communication modality structure. So our extended modality structure may look like this on the Receptive side:
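To make the "plugging in or out" idea concrete, here is a minimal Python sketch of a communication-board cell whose representation (photograph, line drawing, Blissymbol, or printed word) can be swapped per patient while the message it encodes stays the same. The vocabulary and file names are hypothetical placeholders.

# A minimal sketch of swapping the representation modality of a board
# while keeping the underlying messages unchanged.

from dataclasses import dataclass

@dataclass
class Cell:
    message: str                 # what the selection communicates
    representation: str          # "photo", "drawing", "bliss", or "text"
    resource: str                # image file or display string for that representation

def make_board(vocabulary, representation):
    """Build one board from the same vocabulary in the chosen representation."""
    return [Cell(message, representation, resources[representation])
            for message, resources in vocabulary.items()]

vocabulary = {
    "drink": {"photo": "drink_photo.jpg", "drawing": "drink_line.png",
              "bliss": "drink_bliss.svg", "text": "DRINK"},
    "more":  {"photo": "more_photo.jpg",  "drawing": "more_line.png",
              "bliss": "more_bliss.svg",  "text": "MORE"},
}

if __name__ == "__main__":
    # The same two messages, presented first as photographs, then as printed words.
    for cell in make_board(vocabulary, "photo") + make_board(vocabulary, "text"):
        print(cell)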
When we combine ALL the modalities into one picture and take into consideration the wide variety of computers, low-tech devices, and no-tech strategies that are available, it becomes apparent that there is a vast array of options for the SLP to choose from in modifying these modalities to meet the needs of a patient. And of course, there is a plethora of issues associated with each, which will help determine the choice that the SLP will make. Many of these will be examined in the next Section.