Cued speech

Cued Speech is a visual system of communication used with and among deaf or hard-of-hearing people. It is a phonemic-based system which makes traditionally spoken languages accessible by using a small number of handshapes, known as cues (representing consonants), in different locations near the mouth (representing vowels), as a supplement to speechreading. The National Cued Speech Association defines Cued Speech as "...a visual mode of communication that uses hand shapes and placements in combination with the mouth movements and speech to make the phonemes of spoken language look different from each other." It adds information about the phonology of the word that is not visible on the lips, allowing people with hearing or language difficulties to visually access the fundamental properties of language. It is now used with people with a variety of language, speech, communication, and learning needs. It is distinct from American Sign Language (ASL), which is a separate language from English. Cued Speech is considered a communication modality, but it can also be used as a strategy to support auditory rehabilitation, speech articulation, and literacy development.

History

Cued Speech was invented in 1966 by R. Orin Cornett at Gallaudet College, Washington, D.C.[1] After discovering that children with prelingual and profound hearing impairments typically have poor reading comprehension, he developed the system with the aim of improving the reading abilities of such children through better comprehension of the phonemes of English. At the time, some argued that deaf children earned these lower marks because they had to learn two different systems: American Sign Language (ASL) for person-to-person communication and English for reading and writing.[2] Because many sounds look identical on the lips (such as /p/ and /b/), the hand cues introduce a visual contrast in place of the acoustic contrast that is lost. Cued Speech may also help people who hear incomplete or distorted sound; according to the National Cued Speech Association at cuedspeech.org, "cochlear implants and Cued Speech are powerful partners".

Since Cued Speech is based on making the sounds of speech visible, its use is not limited to English-speaking nations. Because of demand for its use with other languages, by 1994 Cornett had adapted cueing to 25 other languages and dialects.[1] Originally designed to represent American English, the system was adapted to French in 1977. As of 2005, Cued Speech had been adapted to approximately 60 languages and dialects, including six dialects of English. For tonal languages such as Thai, tone is indicated by the inclination and movement of the hand. For English, Cued Speech uses eight different hand shapes and four different positions around the mouth.

Nature and use

Though to a hearing person Cued Speech may look similar to signing, it is not a sign language, nor is it a manually coded system for a spoken language. Rather, Cued Speech is a manual modality of communication for representing any language at the phonological level.

A manual cue in Cued Speech consists of two components: hand shape and hand position relative to the face. Hand shapes distinguish consonants and hand positions distinguish vowels. Together, a hand shape and a hand position make up a syllable.[3]
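
A cue can thus be thought of as a simple pairing: each consonant maps to one of a small set of hand shapes, each vowel to one of a small set of positions near the face, and a consonant–vowel syllable is cued as the combination of the two. The sketch below illustrates only this structure; the phoneme groupings and placement names in it are invented placeholders, not the actual American English cue chart.

```python
# Minimal sketch of the cue structure described above.
# NOTE: the phoneme-to-handshape and phoneme-to-position groupings here are
# invented for illustration; they are not the real Cued Speech chart.

HANDSHAPES = {           # consonants -> hand shape number (hypothetical)
    "/p/": 1, "/d/": 1,  # consonants sharing a shape look different on the lips
    "/b/": 4, "/n/": 4,
    "/m/": 5, "/t/": 5,
}

POSITIONS = {            # vowels -> placement near the face (hypothetical)
    "/i/": "mouth",
    "/a/": "chin",
    "/u/": "throat",
}

def cue(consonant: str, vowel: str) -> tuple:
    """Encode a consonant-vowel syllable as a (hand shape, position) cue."""
    return (HANDSHAPES[consonant], POSITIONS[vowel])

print(cue("/m/", "/a/"))  # -> (5, 'chin') under these invented groupings
```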

Cuedspeech.org lists 64 languages and dialects to which Cued Speech has been adapted.[4] Adapting Cued Speech to a new language involves working through a catalog of that language's phonemes, identifying which phonemes look similar on the lips when pronounced, and assigning those phonemes different hand cues so they can be told apart.
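
The core constraint of such an adaptation is that phonemes which are indistinguishable on the lips must never share a hand shape, while phonemes that already look different may share one. The sketch below illustrates that constraint with a naive assignment; the looks-alike classes in it are invented examples, not a real phonetic analysis of any language.

```python
# Hypothetical sketch of the adaptation constraint: within each class of
# phonemes that look alike on the lips, every phoneme must get a different
# hand shape. Phonemes from different classes may reuse a shape, because
# lipreading already distinguishes them. The classes below are invented.

LOOKS_ALIKE_CLASSES = [
    ["/p/", "/b/", "/m/"],  # e.g. bilabials look identical on the lips
    ["/t/", "/d/", "/n/"],
    ["/f/", "/v/"],
]

def assign_handshapes(classes, num_handshapes=8):
    assignment = {}
    for phonemes in classes:
        for shape, phoneme in enumerate(phonemes, start=1):
            if shape > num_handshapes:
                raise ValueError("more look-alike phonemes than hand shapes")
            assignment[phoneme] = shape
    return assignment

print(assign_handshapes(LOOKS_ALIKE_CLASSES))
# /p/ and /t/ may share shape 1: they belong to different classes and are
# already distinguishable on the lips.
```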

Literacy

Cued Speech is based on the hypothesis that if all the sounds in the spoken language looked clearly different from each other on the lips of the speaker, people with hearing impairments would learn the language in much the same way as a hearing person does, but through vision rather than audition.[5][6]

Literacy is the ability to read and write proficiently, which allows one to understand and communicate ideas so as to participate in a literate society. Cued Speech was designed to help eliminate the difficulties of English language acquisition and literacy development in children who are deaf or hard of hearing. Research results show that accurate and consistent cueing with a child can help in the development of language, communication, and literacy.

Cued Speech does achieve its goal of distinguishing phonemes received by the learner, but there is some question of whether it is as helpful to expression as it is to reception. An article by Jacqueline Leybaert and Jesús Alegría reports that children who are introduced to Cued Speech before the age of one keep pace with their hearing peers in receptive vocabulary, though their expressive vocabulary lags behind.[7] The authors suggest additional, separate training in oral expression where it is desired. More importantly, this reflects the nature of Cued Speech as a means of adapting the hearing-impaired to a hearing world; such gaps between reception and expression are less commonly found in hearing-impaired children learning sign language.[7]

In her paper "The Relationship Between Phonological Coding And Reading Achievement In Deaf Children: Is Cued Speech A Special Case?" (1998), Ostrander notes, "Research has consistently shown a link between lack of phonological awareness and reading disorders (Jenkins & Bowen, 1994)" and discusses the research basis for teaching cued speech as an aid to phonological awareness and literacy.[8] Ostrander concludes that further research into these areas is needed and well justified.

The editor of the Cued Speech Journal reports that "Research indicating that Cued Speech does greatly improve the reception of spoken language by profoundly deaf children was reported in 1979 by Gaye Nicholls, and in 1982 by Nicholls and Ling."[9]

In the book "Choices in Deafness: A Parents' Guide to Communication Options", Sue Schwartz describes how Cued Speech helps a deaf child learn pronunciation. The child can learn how to pronounce words such as "hors d'oeuvre", "tamale", or "Hermione", whose pronunciations differ from their spellings, and can learn about accents and dialects. In New York, coffee may be pronounced "caw fee"; in the South, the word friend ("fray-end") can be a two-syllable word.[10]

Debate over cued speech vs sign language

The topic of deaf education has long been controversial. Two strategies for teaching deaf children exist: an aural/oral approach and a manual approach. Those who use aural-oralism believe that children who are deaf or hard of hearing should be taught through the use of residual hearing, speech, and speechreading. Those promoting a manual approach believe the deaf should be taught through the use of signed languages, such as American Sign Language (ASL).[11]

Within the United States, proponents of Cued Speech often present the system as an alternative to ASL and similar sign languages, although others note that it can be learned in addition to such languages.[12] For the ASL-using community, Cued Speech is a unique potential component for learning English as a second language. Within Bilingual-Bicultural models, Cued Speech does not borrow or invent signs from ASL, nor does it attempt to change ASL syntax or grammar. Rather, it provides an unambiguous model for learning the spoken language that leaves ASL intact.

Languages

Cued speech has been adapted to more than fifty languages and dialects. However, it is not clear how many of them are actually in use.[13]

Similar systems have been used for other languages, such as Assisted Kinemes Alphabet in Belgium and the Baghcheban phonetic hand alphabet for Persian.[15]

References

  1. "All Good Things...Gallaudet closes Cued Speech Team", Cued Speech News, Vol. XXVII, No. 4 (Final Issue), Winter 1994, p. 1.
  2. http://www.washingtonpost.com/wp-dyn/content/article/2010/09/27/AR2010092705382.html
  3. Heracleous, P., Beautemps, D., & Aboutabit, N. (2010). Cued speech automatic recognition in normal-hearing and deaf subjects. Speech Communication, 52, 504–512.
  4. http://www.cuedspeech.org/cued-speech-in-different-languages
  5. Cornett, R. Orin. Cued Speech: What and Why?, undated white paper.
  6. Proceedings of the International Congress on Education of the Deaf, Stockholm, Sweden, 1970, Vol. 1, pp. 97–99.
  7. Leybaert, J., & Alegría, J. (1990). Cued Speech and the acquisition of reading by deaf children. Cued Speech Journal, 4, 25–37.
  8. Ostrander, Carolyn (1998). "The Relationship Between Phonological Coding And Reading Achievement In Deaf Children: Is Cued Speech A Special Case?" http://web.syr.edu/~clostran/literacy.html (accessed August 23, 2006).
  9. Boggs, Carol J. (ed.), "Editor's Notes", Cued Speech Journal (1990), Vol. 4, p. ii.
  10. Schwartz, Sue. Choices in Deafness: A Parents' Guide to Communication Options.
  11. National Cued Speech Association (2006). "Cued Speech and Literacy: History, Research, and Background Information" (PDF).
  12. Cued Speech FAQ, http://www.zak.co.il/deaf-info/old/cued_speech.html
  13. Cued Languages – list of languages and dialects to which Cued Speech has been adapted.
  14. http://www.vinkkipuhe.fi/

