Protactile
Also known as: Protactile American Sign Language
States: United States
Region: Washington, Oregon
Language family: Francosign → FSL–MVSL → American Sign
Protactile is a language used by DeafBlind people that relies on tactile channels. Unlike other sign languages, which depend heavily on visual information, protactile is oriented toward touch and is practiced on the body. It originated among DeafBlind people in Seattle in 2007 and incorporates signs from American Sign Language. Protactile is an emerging system of communication in the United States, whose users rely on shared principles such as contact space, tactile imagery, and reciprocity.
In 2007, three DeafBlind women working at the Deaf-Blind Service Center in Seattle, aj granda, Jelica Nuccio, and Jackie Engler, communicated with each other in American Sign Language (ASL) through interpreters.[1] Using ASL meant the group either had to rely on interpreters to communicate simultaneously or was limited to conversations between just two people at a time (using hand-over-hand signing). The three worked together to devise ways to talk with each other directly, using their sense of touch as the primary source of information.[2] They began inviting other DeafBlind people into their conversations and interacting using these new communication practices.
Protactile has emerged in communities of people who were born deaf, learned ASL as children, and then gradually lost their sight over decades, as is common in Usher syndrome.[3] Leaders and educators granda and Nuccio describe a "protactile movement" as giving DeafBlind people a sense of community and a language in their preferred modality, a remedy to the isolation imposed by hearing and sighted culture.[4] They describe a protactile philosophy as supporting DeafBlind culture, relationships, and politics. Helen Keller Services for the Blind describes protactile as "much more than a system of touch signals," but rather "a philosophy and a movement which focuses on autonomy and equality for people who are deaf-blind."
In protactile, communication takes place through touch and movement, focused primarily on the hands, wrists, elbows, arms, and upper back, and, when seated, the knees and tops of the thighs.[5] In formal instruction of protactile while sitting and facing a conversation partner, the "listening hand" has the thumb, index finger, and pinky extended and rests on the thigh of the other participant.[6] For example, several rapid taps on the thigh with all four fingers indicate "yes," while a rapid back-and-forth brushing movement with the fingers indicates "no."
Tactile maps are used in protactile to communicate spatial information about the environment to the DeafBlind person. A map can be drawn on a recipient's hand, arm, or back to describe the surroundings or give directions.
Instead of the "air space" used in visual sign languages (the space around a signer's body), protactile is rooted in "contact space."[7] While ASL and other sign languages rely on handshape as one of the core components that distinguish one sign from another, in protactile the handshape matters less than the sensation received: a series of tapped signs using different handshapes would all be received simply as taps, with the handshapes indistinguishable.[8]
A significant innovation in protactile involves the concept of reciprocity.[9] Communication partners are encouraged to use the same communication method (as opposed to using signed or spoken language along with protactile) to ensure vision is not unduly privileged. Sharing experience is a core principle of protactile, with tactile imagery evoking sensations in storytelling in the same way that facial expressions do in a conversation between sighted people.
Serving the same function as body language or verbal acknowledgments (such as "mm-hmm" or "yeah"), tactile backchanneling allows for smoother communication in protactile conversations. Tapping the partner's arm or leg during pauses or as confirmation of understanding serves as a continuous loop of backchannel feedback. Agreement, disagreement, laughter, and other responses are signaled using manual cues. These cues are not standardized, but are developed according to the needs of the individual and specific situation.[10]
The DeafBlind Interpreting National Training and Resource Center was launched in 2017 as a resource for DeafBlind people.[11] The Center staff work to train protactile interpreters; as DeafBlind author John Lee Clark writes, "instead of providing 'accurate and objective information' in a way that unsuccessfully attempts to create a replica of how they're experiencing the world, Protactile interpreters must be our informants, our partners, our accomplices."
A grant from the National Science Foundation led to the creation of a hybrid learning environment for young DeafBlind children.[12] The DeafBlind Kids! website provides parents and caretakers with information about protactile concepts such as tactile exploration, backchanneling, and co-presence.
Protactile communication fosters inclusion and autonomy by providing DeafBlind people with more information about their environment.[13] More robust communication leads to fewer misunderstandings and a greater sense of involvement and connection.