Tuesday, August 31, 2010

CDC Creates Seasonal Flu Videos in ASL

CDC Logo
The Centers for Disease Control and Prevention (CDC) now has videos about seasonal flu available in American Sign Language.

To access the videos, click here.

The CDC partnered with the University of Rochester Prevention Research Center and the Deaf Wellness Center to create the videos, which are geared not only towards ASL users but towards a Deaf frame of mind. According to Mindy Hopper, a Deaf doctoral student at the University of Rochester, "Most Deaf people are visual learners…They gain knowledge when the information is conveyed in a shared, natural, and intelligible language [...] The way the video is designed, Deaf viewers can capitalize on not only the textual information but the conceptual information."

Monday, August 30, 2010

How Do You Handle Car Trouble?

Our new car.
Recently my husband and I bought a new car, a 2010 Chevrolet Cobalt. We needed a second car since he has a long commute now and my little truck just isn't that reliable.

In fact, trouble with the truck has got me thinking. How do deaf people deal with car trouble? By that I mean all of the audible signals that something is wrong with the car.

I've run into issues with my truck that could have been caught sooner if I were just able to hear the truck better. A lot of the signals cars make are high pitched, and I just don't hear them. For example, right now my brakes are squealing, but I don't hear anything. I can only feel the grinding when I press on the brake pedal.

The truck also makes a dinging noise if it thinks I left the key in the ignition and have opened the door. 99% of the time I didn't actually leave the key but it thinks I did. (My truck is a little quirky, as you can tell.) Usually I don't hear the dinging right away. I open the door, think I heard something, look around, listen, and finally realize it was dinging.

Luckily my husband rides in and drives the truck often enough that he hears what I don't. But thinking about driving often without someone else giving feedback makes me wish that more cars had visual as well as auditory signals. That wouldn't be practical for things like mechanical failure, when the sound isn't an intentional signal but something breaking. But signals such as the key in the ignition, lights left on, etc. could easily be indicated visually as well.

The new car does a pretty good job of being visual as well as auditory. There's a ton of information on a screen on the dash. The car itself is quiet so any sounds it makes would be unusual. I don't know enough about other cars to know what models are good at visual indicators.

What do you think? How do you handle car trouble?

Friday, August 27, 2010

Sonic Alert Responds Awesomely to College Student's Requests

I wanted to pass along this story of a company that makes accessible technology doing things right. I love my Sonic Alert alarm clock (the heart-shaped one). My audiologist sells them, and I've had one since I was in high school. These alarm clocks have an extra-loud alarm as well as an accessory that vibrates the bed or pillow to wake people up. They're great for the deaf, but also for heavy sleepers like Justin, a college student who wrote to Sonic Alert asking for a few features their alarms don't have.

For one, the alarm clock Justin has doesn't time out its alarm. It will keep going - vibrating and sounding the alarm - until it's physically turned off. Secondly, Justin wanted the clock to be atomically set (synchronized to an atomic clock signal) so it wouldn't lose minutes over time.

Rather than ignore his suggestions or just write back and say that they were taking them into consideration, Sonic Alert replied with the reasoning behind both settings and sent Justin a travel-size Sonic Alert alarm, which does time out - for free.

Now, that is cool. That's a company that is dedicated to its customers. I was just looking at Sonic Alert's website, and I never realized they make much more than just alarms. They have signaling systems, amplified phones and listening devices. Very cool.

Wednesday, August 25, 2010

Verizon Foundation Gives $55,000 to Massachusetts Nonprofits Aimed at Accessibility

Yesterday, Verizon awarded $55,000 to Massachusetts nonprofits that are dedicated to helping people with disabilities. Of that, $35,000 went towards nonprofits that can help deaf people, with the rest going to the Lowell Association for the Blind and the National Braille Press. Those that will assist deaf people are:
Verizon's philanthropic department is called the Verizon Foundation. According to their webpage, "The passion to lead corporate philanthropy efforts and create a stronger, wiser, and healthier nation and world stands at the core of the Verizon Foundation's mission. That passion, in turn, stems from the hearts, minds, and hands of Verizon's employees, Verizon's retirees, and partnerships we create with nonprofit organizations that have deep ties in their communities."

Check out this Forbes list of America's most philanthropic corporations; it's a bit old, but Verizon's on there, along with others.

(via InMyLingo)

Tuesday, August 24, 2010

NASCAR is Loud, But Just How Loud?

A NASCAR track is not one of those places you can go expecting some peace and quiet. While drivers and crew may wear ear protection (though they're not required), fans of NASCAR often don't, enjoying the sound of the cars and the crowd. Now a study by NIOSH - the National Institute for Occupational Safety and Health - has broken down exactly how loud NASCAR is. And yes - it's a threat to your hearing.
#88
Here's how it broke down. Keep in mind that 85-90 decibels is usually cited as the level above which sustained exposure can damage hearing (there's a rough exposure-time calculation after the list).

The pit area: up to and exceeding 130 decibels
The stands: 96 decibels
Inside a car during practice: 114 decibels
Racing: up to and exceeding 140 decibels
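
For perspective, NIOSH's occupational limit is 85 decibels averaged over an 8-hour shift, with the allowable time cut in half for every additional 3 decibels. Applying that rule to the numbers above is my own back-of-the-envelope math, not something from the NIOSH report, but it gives a sense of scale:

\[
T(L) = \frac{8}{2^{(L-85)/3}} \text{ hours}
\]

By that formula, the stands at 96 decibels use up a full daily dose in roughly \(8/2^{11/3} \approx 0.6\) hours (about 38 minutes), while the pit area at 130 decibels does it in \(8/2^{15} \approx 0.0002\) hours - less than a second.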

Here's what NIOSH recommends as a remedy: "Crewmembers should be afforded the same hearing protection currently provided to drivers—that of custom-molded earplugs with built-in speakers. Workers at race tracks and spectators should also be made aware of the noise problem through education and informational campaigns."

However, judging from the comments on the NIOSH blog - which tell the agency not to waste money on the survey (one requested by the management of a professional racing team) and to "butt out" - it remains to be seen whether these recommendations will be taken to heart.

Monday, August 23, 2010

Norway Develops Smart Ear Protection

Short Canyon near the Jostedalsbreen, Norway
The petroleum industry can be a loud and hazardous one. In Norway, people in the industry report 600 cases of hearing loss each year. Now Statoil - Norway's largest company - is taking technology used by the military and turning it into an ear protection and communication device for use on offshore platforms.

The device - called QUIETPRO - is more like a digital hearing aid than your typical earplug. There's a microphone on the outside, which filters out sounds and can be controlled by the user, and a microphone inside, which picks up speech as it travels through the skull, eliminating the need for a microphone outside the mouth. The newest device, in testing right now, even has an alert that will tell users when sound is loud enough to potentially damage hearing.

A noise specialist at Statoil, Asle Melvær, says, "Users of the new device do not have to strain to hear what is being said over the radio, and the noise reduction system in the earplug means that the level of sound is adapted to the surrounding environment. On board an oil platform understanding messages transmitted by radio can be a matter of life and death."

Friday, August 20, 2010

MobileASL: A New Video Compression Scheme for ASL Over Cellphones

Texting in traffic
How do you use your cellphone? Do you mostly call people or do you text? What if you had a third option: signing?

I'm a big texter. I recently got an Android phone - an LG Ally - and all of a sudden I find myself using the phone's messaging function or Google Talk a lot. I love how I can see the entire conversation on the phone.

I am not fluent in American Sign Language, but if I were, I know I would love a phone that could show video easily and accurately enough for me to sign with someone else. And luckily enough, researchers at the University of Washington are working on just that: a real-time video compression scheme that will allow people to communicate effectively in ASL over their phone.

According to their website, "...due to the low bandwidth of the U.S. wireless telephone network, even today's best video encoders likely cannot produce the quality video needed for intelligible ASL. Instead, a new real time video compression scheme is needed to transmit within the existing wireless network while maintaining video quality that allows users to understand semantics of ASL with ease."

As part of the project, the group conducted field tests, giving study participants the phones for three weeks and having them answer usability questions after each call. According to one student, "With the MobileASL phone people can see each other eye to eye, face to face, and really have better understanding."

The iPhone has an app called "FaceTime" - I'm not sure if there is a similar app for Android. It's the app you may have seen in all of those iPhone 4 commercials lately. To take advantage of FaceTime in a way that lets you sign, you need to be in an area with a very fast connection, and of course you have to buy an iPhone and pay data usage fees. By contrast, MobileASL could work with any phone that has a video camera on the same side as its screen, and FaceTime uses 10 times the bandwidth of MobileASL.

With any luck, soon this will be something that all signers will have access to.

Wednesday, August 18, 2010

1 in 5 U.S. Teens Has a Hearing Loss

Teens
The big news in my Google Reader today is a new study, covered here in USA Today and based on survey data from the Centers for Disease Control and Prevention, finding that one in five United States teenagers has hearing loss. This is an increase of 31% from the mid-1990s.
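
To reconcile those two figures - this is my own arithmetic, using only the numbers in the article - the 31% increase is relative, not percentage points. If about 20% of teens have some hearing loss today, the mid-1990s rate works out to

\[
\frac{0.20}{1.31} \approx 0.15,
\]

or roughly one teen in six to seven back then.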

According to the article, "Most cases of hearing loss are slight, affecting only one ear and involving mostly high-frequency sounds [...] About one in 20 have "mild or worsening" hearing loss, which can make them struggle to follow conversations or teachers at school."

Interestingly, the study doesn't seem to correlate listening to loud music with the hearing loss - they conducted interviews with several thousand kids and found that kids who listened to loud music for five hours or more a week were not more likely to have hearing loss. Other studies, however, have linked portable music players to a 70% increased risk of loss. Poor health in general can also contribute to hearing loss.

Teens may have a tendency to ignore their parents and think of themselves as invincible, but I remember going through school with a hearing loss. My loss was recognized and I have hearing aids, but I remember feeling stupid for not being able to follow directions or for missing out on conversations. I didn't like feeling that way even when I knew I could blame it on my difficulty hearing. The problems kids might face from a slight hearing loss in school now, especially an unrecognized one, could compound by college and by the time they enter the workforce.

Monday, August 16, 2010

Guest Post: Why Hearing Aids are Starting to Sound Cool

The following guest post is from Stuart Spencer of The Hearing Company. All images in this post were provided by The Hearing Company. For information about guest posts on Hearing Sparks, please see here.
 
The Hearing Company is a UK based hearing care retailer which has been dispensing hearing aids for more than 50 years. 

Over the same time period there have been massive advances in all things technological, and the humble hearing aid is no exception.

While for many the stereotype of a cumbersome, beige-coloured earpiece denoting old age persists, manufacturers are doing their best to make modern hearing aids both advanced and desirable.

A recent model from Starkey offering wireless integration into Bluetooth compatible devices was even profiled in Time magazine as part of a report on Coolest Inventions. 

So where did this journey begin and what have been the milestone achievements in hearing aid design along the way? 

If asked to picture an early hearing aid, most of us would recall the metal ear trumpets used by our 19th century ancestors, yet the first artificial hearing aids can actually be dated back some 2,000 years earlier, when the ancient Greeks used sea shells to assist those hard of hearing.

Hearing aid development was slow – in fact, very little had really changed between the ancient Grecian shell and the 19th century trumpet.   

Apart from a few slightly updated, though no less unwieldy, models in the late 1800s and early 1900s, it is the last 60 or so years that have seen the hearing aid transcend its bulky origins to become the high-tech, near-invisible hearing aids of the present day.

A brief history of the hearing aid: 1950 to the present day

1950s: The transistor hearing aid
1950s ear piece
The 1950s was a golden era for hearing aid technology.  Mass production and the revolution in consumer advertising meant hearing aids became cheaper and more widely worn by both sexes.  Gone were the horns, trumpets and tube-like devices of previous decades.  
Thanks to the invention of the transistor in the late 1940s (which amplifies electronic signals and replaced the valve in most electronic devices) hearing aids of the 1950s were smaller – roughly the size of a hand – and more powerful.  Nevertheless, they were still worn outside the body with power packs tied around the waist or strapped to the leg.  

1960s: The Behind The Ear (BTE) hearing aid
1960s battery pack
While those with a hearing impairment in the 1960s still had to carry their hearing aids about their body, the devices were at least getting smaller and lighter. By the end of the decade they were even small enough to be placed directly behind or above the ear.

1970s: The analogue ‘NHS’ hearing aid
Probably the image that springs to mind when talking about hearing aids, this beige number, first made available in the UK through the National Health Service in 1974, was the decade's main contribution to hearing aid design.

Although smaller than earlier models, they were still chunky, analogue (manual) devices that sat behind the ear. While they did aid people with a hearing impairment, wearers suffered considerable interference (feedback that made a whistling noise) and had difficulty hearing in noisy social situations.

1980s: The In The Ear (ITE) hearing aid
Still an analogue device, the hearing aid had shrunk so much in size that it was now possible to fit the unit inside the external ear.  Although not a wholly attractive model, the ITE hearing aid was more cosmetically appealing than previous chunkier ones worn outside the ear. 

1990s: The Digital hearing aid
The 1990s marked the digital revolution. The analogue hearing aids of the previous decade were at first replaced by 'hybrid' models (part analogue, part digital) and eventually by fully digital models in the late 1990s.

The crossover was like the transition from vinyl to CD; the digital models were programmed to meet each sufferer’s individual requirements and offered a cleaner, distortion-free signal. 

2000s: Multifunction hearing aids
Although hearing aids are great for improving your hearing in the home, users have always struggled in less than ideal acoustic environments - against background noise, in the theatre or in a busy restaurant.
Multiple ‘memories’ which allow the hearing aid to perform differently in particular situations go a long way to remedying these difficulties.  Additionally, Bluetooth wireless technology allows the user to communicate using telephones and other similarly equipped devices whenever and wherever they want without interference and feedback. 

2010: Hearing aids made cool?
Hearing loss is one of the oldest of the known disabilities and attempts to amplify sound go back centuries.  But while poor vision is widely accepted across the generations, hearing loss remains stigmatised and associated with the elderly. 

Like short-sightedness it is, however, a problem which affects people of all ages and this is something that modern hearing aid manufacturers are successfully addressing. 

The latest In The Ear hearing aids are so discreet they are essentially contact lenses for the ears. No one apart from the wearers themselves would know they are using one.

Hearing aid technology has come so far since the company I now work for first began trading, from bulky boxes worn around the waist to tiny electronic devices worn inside the ear.  

It is almost impossible to predict where we will be in another ten, let alone 50, years' time. Even now there are hearing aids on the market that use artificial intelligence, recording and storing the wearer's hearing patterns and automatically adjusting to any discrepancies.

What is really important though is for the industry to work collaboratively on trying to improve the public image of hearing aids.  In the UK we have more than nine million people suffering from some degree of hearing loss, but only two million actually seeking professional advice. 

The introduction of better quality and more socially acceptable hearing aids will result in a new generation of people looking after their hearing as they should.

Thursday, August 12, 2010

Help the Ubuntu (Linux) Accessibility Team

I am a big proponent of open source software, and I personally love using Linux at home. Since I began using it on my laptop three or four years ago, I've found it more comfortable than Windows ever was for me - more customizable and more fluid for the way I work.

Accessibility When Computing
In any operating system, accessibility is a big concern. Many people are already familiar with the accessibility features in Windows and additional software that can be used, but when it comes to a smaller operating system - like the one I use at home, Ubuntu Linux - newcomers to the system may have no idea it even has accessibility features. But as Penelope Stowe (a member of the Ubuntu Accessibility Team) is quoted on the blog jonobacon@home, "[a]t the heart of Ubuntu’s philosophy is the belief that computing is for everyone, whatever your circumstances."

Help Ubuntu Developers
On Ubuntu's website, they have a survey that will help developers understand computer users and the accessibility technology they need. You can help out the Accessibility Team by filling out the survey and emailing it to the address indicated on the page.

Currently Existing Tools
Penelope Stowe says, "The Ubuntu Accessibility team has existed from the start, providing support to those requiring assistive technology to operate the Ubuntu desktop." What is currently available for Ubuntu users?

Click here for the Ubuntu Accessibility Start Guide, and here's a link to a good rundown of options.
  • Press F5 when booting an Ubuntu live CD* to bring up options for those with varying visual impairments or motor difficulties
  • Orca is a screen reader that uses a combination of speech, braille, and magnification
  • On-screen keyboard
  • Mousetrap is an alternative input system that uses a webcam
  • Dasher is a gesture-driven input system
  • And of course, programs like Open Office (for word processing) and Firefox have their own accessibility options
* A Live CD is a copy of the Ubuntu operating system that can be loaded on your computer without affecting your original operating system, like Windows. It's a great way to try Ubuntu without committing to it.

Wednesday, August 11, 2010

Microsoft Kinect's Patent Says It Knows American Sign Language (Update: No, It Doesn't)

Kinect Sensor
According to Microsoft's patent for their Kinect peripheral, Kinect can recognize American Sign Language.

Microsoft Kinect is a peripheral designed to work with Microsoft's Xbox 360 game console. Not yet released, the system uses a webcam-like attachment to the console that can interpret gestures and movements, allowing a player to play a video game without a physical controller. It's a bit like a Wii Remote or PlayStation Move. The system maps the player's body, even in low light, and can recognize small movements.

However, in the Kotaku article about the patent, there is skepticism about the claim that it can understand American Sign Language. According to Kotaku user NoSpecies, "Assuming they did add some support for this they would seriously have some problems not only with the speed of the language but also the ability to recognize words from movement no matter how obscured fingers are." The patent could be outdated compared to the eventual product, or Microsoft could have plans to further refine the technology until it can recognize the fine handshapes and movements of Sign.

This is interesting, and definitely something to keep an eye out for. I could see a very useful American Sign Language instructional game being made from this, as well as other accessibility features that could be implemented.

Update: According to a Kotaku post today, the version of Kinect shipping this year is not as capable as originally thought, and in order to reach the $150 price point, Microsoft "dumbed down" the camera. Microsoft is quoted in the blog post saying, "We are excited about the potential of Kinect and its potential to impact gaming and entertainment. Microsoft files lots of patent applications to protect our intellectual property, not all of which are brought to market right away. Kinect that is shipping this holiday will not support sign language." Disappointing!

Tuesday, August 10, 2010

Guest Post: Digital Hearing Aids Care Tips

The following guest post is from the team at hearing aids seller, HearingDirect.com. For information about guest posts on Hearing Sparks, please see here.

If you are using digital hearing aids to help with your hearing impairment, you already know they are not cheap. If you look after them and apply proper care, you can significantly increase their lifespan and spend more money on the things you love doing, like watching your favorite team play ;)

These tips have been accumulated over time and will hopefully serve you well...

Remove it completely when not in use - This step will ensure that the device does not incur unnecessary wear and tear. After removing the hearing aid, carefully wipe it down with a soft cloth or tissue and remove any earwax which might have accumulated at the end of the aid. 

Place it in a safe place - When the hearing aid is removed, place it in a safe place away from children and pets. If possible, store it in an environment that is not subject to temperature extremes.

Avoid exposing the hearing aid to excessive moisture - The obvious things would be to not wear it while showering/bathing or swimming. Also avoid using hair sprays or gels when the hearing aid is in your ear. 

Try to ensure your ears are dry - Before you put your hearing aid back in, make sure your ears are dry and clean to avoid the build-up of moisture.

Clean and inspect it often - Be sure to clean the receiver and vent or tubing openings with an appropriate wax removal tool. It is a good idea to ask the seller for instructions on how to prevent damage to the receiver (the loudspeaker component). During the cleaning process, check the battery compartment contacts for corrosion or rust, which can lead to a breakdown.

Replace the batteries as soon as needed - Many hearing instruments have audible low-battery warning signals; be sure to ask your provider what these will sound like, as well as how long you can expect the hearing aid battery to last. A battery can last anywhere from two to six weeks depending on the model, type of use and so on; with time you will get a better feel for your particular aid.

Bonus tips - Some insurance providers offer extended warranty policies, and if you are concerned about losing your devices, many insurers allow hearing instruments to be added to household insurance policies.

Monday, August 9, 2010

Bohemian Rhapsody in American Sign Language

Just a video to wake you up this Monday morning... ASL interpreter Sam Farley covering Queen's Bohemian Rhapsody in the car:



Are you awake yet?

Saturday, August 7, 2010

Of Triceratops and Rocko's Modern Life, and Captions

Triceratops
The other day we had a brand-new whiteboard installed at work. It's just an ordinary whiteboard for writing notes and information on, with half of it a corkboard for posting items. Naturally, the same day it was put up it became covered in a long, random conversation about Post-Its. One of my coworkers came in and drew a (very good, actually) picture of Spunky, the dog from Rocko's Modern Life, and another of my coworkers, Caris, and I got into a conversation about Spunky and Rocko. (That eventually led to me dashing his childhood joy with the news that Triceratops is now thought to be just a juvenile form of Torosaurus - but don't worry, they're keeping the name Triceratops.)

However, one part of our conversation got me thinking. Caris pointed out that Spunky looks like a smaller version of Rocko and referred to Rocko by his species. I thought I heard "wildebeest," so I called him a wildebeest later, only to be corrected that Rocko is actually a wallaby - which is what Caris said in the first place.

Now, I loved Rocko's Modern Life growing up. I actually loved all of Nickelodeon's classic shows like Doug, Clarissa Explains It All, Legends of the Hidden Temple, The Ren and Stimpy Show, Rugrats, and Hey Arnold!. I was exactly the right age to enjoy these shows when I was a kid, along with my brother and my neighborhood friends. But I never knew even such basic facts as what species Rocko was, and it took me quite a few episodes of Doug to realize his dog's name is Porkchop.

Why? Well, because Nickelodeon rarely captioned their shows in the early 1990s. I remember enjoying their shows a lot, but finding PBS shows like Wishbone and Where in the World is Carmen Sandiego? easier to watch, because PBS has a solid track record with closed captioning. Not only was I missing the captions, but the fact that a lot of those Nickelodeon shows were cartoons - with no real mouths to read - only hindered my ability to understand.

Whether or not my future children are deaf, I intend to have closed captioning on the television all the time. It not only helps with understanding shows but can also develop children's vocabulary, spelling and grammar skills if they constantly see printed words.

If they are deaf, I feel grateful that 99% of the programming they'd be watching will be closed captioned. With any luck they will always know that Rocko is a wallaby and that Porkchop is Doug's dog.

Friday, August 6, 2010

6 Behaviors to Increase Clarity on the Telephone

1896 Telephone
Growing up, I rarely used the telephone beyond calling family members. I was too intimidated by it. Even in high school, the prospect of calling a friend and not being able to hear them, or getting someone else on the line and not being able to reach the person I wanted, made me nervous. Calling a stranger was even worse. At work, however, I find myself using the phone often. At first I was pretty nervous about it; after all, 95% of the time I am on the phone at work, I am talking to strangers whose voices I am unfamiliar with.

If I do say so myself, though, I am doing pretty well on the phone. I do not use any amplification beyond turning the phone's volume all the way up. I do have trouble when people have thick accents or when the connection is bad, but luckily there are always coworkers around who can help me out. And happily enough, people have told me my voice is very kind and clear on the phone. I feel lucky that the phone works for me with my hearing aids and that I ended up not needing anything special.

However, I have noticed some common pitfalls on the other end of the line that inevitably lead to confusion. As somebody who is relatively new to using a phone regularly, I wanted to write about these and encourage everyone to communicate as clearly as possible. Whether you are deaf or hearing, it is important that communication is clear. It's not just about the other person; it will also help you get what you want faster.

1. Start and end the conversation clearly. Say "hello" and "goodbye" rather than launching into a question or comment and hanging up after you hear a response. "Hello" and "goodbye" are not as much for you as they are for the other person. They are communication markers. (Raise your hand if you now have the Beatles song stuck in your head.)

2. When asked for a name, do not spell it immediately. Say the name first, and if asked, or if the person is having trouble, then spell it. When I am on the phone and ask, for example, for an author's name, it confuses me when people start spouting letters at me. Since I work from context a lot of the time in a sentence, I am expecting words, not single sounds like letters. I inevitably miss the first few letters and they have to spell it again anyway.

3. Use the same format the entire way through when listing off numbers. For example, if your phone number is 555-8791, do not say "five-five-five, eighty-seven ninety-one" or any mixture of those. Again, it is about what the other person is expecting. Switching from single numbers to two-digit numbers is confusing and impedes comprehension.

4. If a person is typing and you are listing off an account number or something like that, do not say things like "six, three zeroes, nine..." It may take the person longer to type three zeroes than it does for you to say it, so they will find themselves behind. It is also unexpected by the other person.

5. If you need to speak to someone beside you and you are on the phone, excuse yourself from the phone, put your hand over the mouthpiece, speak to the other person, and then return to the line. This avoids confusion from the other person about who you are speaking to. I've replied to questions that weren't intended for me several times. (The most confusing was "How old are you?")

6. If you are in a crowded place, step outside to continue your conversation or remove yourself to a quieter room.

I think a lot of these are kind of elementary, but it surprises me that so many people ignore them. Since the person on the other end can't see you, I guess they have no idea they are being confusing. I hope this list helped.

Thursday, August 5, 2010

Guest Post: Surround Sound – Not Just for Movie Theaters

The following guest post is from Laura, a hearing aid center employee. For information about guest posts on Hearing Sparks, please see here.

As you can see from Hearing Sparks, the landscape of hearing technology is constantly expanding and offering better hearing experiences. Depending on your daily routine and specific needs – which can be found through a hearing test – your ability to hear sounds beside and behind you may be crucial. As a hearing aid center employee, my customers bring up this issue quite often.

Here are just a few tech advancements in digital hearing aids that can help you hear surrounding sounds:

Directional microphones – Some hearing aid models that use these help you track down and focus on sounds that you’re trying to hear instead of those coming at you from other angles.

Noise reduction – Hearing everything around you can be a disadvantage near a noisy air conditioning unit or in blaring traffic. Some hearing aid models have monitors that distinguish speech from noise, helping you concentrate more on what you’d prefer.

Volume preferences – The wrong volume level can remind you of your hearing loss or even worsen it. Many hearing aid brands keep track of your volume, bass and treble, and environment preferences so that you won’t have to constantly adjust them to fit your current environment.

Although there’s a wide array of hearing loss solutions out there, pinpointing exactly what you want and need is most easily done by taking a free hearing test. If you think you may have hearing loss, be sure to consult a hearing aid center professional or doctor.

Wednesday, August 4, 2010

Astronaut Tracy Caldwell Dyson Signs from Space

Astronaut Tracy Caldwell Dyson says, "One thing I have learned is that deaf people can do anything. The only thing they can't do is hear. Maybe one day you can fly into space and live on the ISS."

ISS 3
What's the ISS? And why is Tracy Caldwell Dyson talking about flying into space?

The ISS is the International Space Station, where Dyson has been living since April 4 of this year. She's talking about flying into space because it's true - maybe you could, one day, even if you are deaf. And she said it in American Sign Language, from the Space Station. You can watch the video here.

According to the above-linked article, in 1992 Bill Readdy also signed a message from STS-42, a Space Shuttle Discovery mission. That video is at the link and you can watch it here:

Tuesday, August 3, 2010

Seeing Voices, by Oliver Sacks

This review was originally published on my Goodreads profile here (hence the references to this blog). I decided to publish it here as an overview of what I think of the book. Further blog posts on certain subjects in the book are forthcoming.

Seeing Voices was originally published in 1989. That was a big in-between year for the deaf. In 1988 Gallaudet students successfully pushed for a deaf president of the university. And in 1990, the Americans with Disabilities Act would be signed into law.

As for me, in 1989 I was three years old. I had not yet been diagnosed with my own hearing loss. I had no idea who Oliver Sacks was, what "deafness" means, where Gallaudet is, or what American Sign Language is. Two years later my worried parents and grandparents would hear that I have a progressive sensorineural hearing loss, which began as a mild loss and has since progressed to a severe loss in one ear and a profound loss in the other.

Since I was "mainstreamed" as a child - educated without special education classes in a typical public school environment - I essentially knew nothing of other deaf people except Helen Keller. It was only when I began to take American Sign Language courses from the local community college (to avoid having to take two years of a spoken foreign language, which just confused me) that I learned of a Deaf culture, a Deaf identity, and the struggle that Deaf people have faced over hundreds of years. In that class I watched videos and read books and learned about the culture from my Deaf professor. And I learned about American Sign Language.

ASL is an interesting language. Although it is functionally very beautiful, with flowing hands and a rhythm all its own, it can be off-putting to people not used to it. The gestures can be forceful (depending on the meaning behind the sign). Facial expressions are exaggerated along with the signing. Deaf people can be pushy: putting themselves directly in your line of sight, smacking you on the arm to get your attention, waving their hands all up in the air. So it was kind of uncomfortable at first. But there is something about sign language that draws you in. It feels right when you sign, even if you are hearing. It feels like you are just learning another skill, not another language.

I picked up this book because I've read Oliver Sacks' books before and never realized he'd published one about Deafness and Sign - and because I have a blog, Hearing Sparks, and wanted to write about the book there so that people know whether they ought to pick it up. I plan to publish both this review and some more in-depth blog posts about the subjects the author touches on.

Sacks splits his book up into three parts, which were all written at different times.

Part 1 is a cohesive and very readable history of deaf people as well as information about deafness (both medical and cultural) and Sacks' own introduction to the world of the deaf. We learn in school about history from the point of view of American colonists (if we are American) and slaves, basically. Reading about decades passing from the viewpoint of the deaf introduces not only a third viewpoint but the idea that there are many other viewpoints from which history could be told. In this chapter, Sacks draws a line between prelingually and postlingually deaf. The postlingually deaf are relatively often the most successful deaf people, because they have the memory of spoken language, grammar, sentence structure. Prelingually deaf people face challenges distinct to them, and forcing spoken language on them can lead to unforeseen consequences.

Sacks' position on oral vs. signed education for the deaf is subtly introduced in this part. He isn't forceful or annoying with his position; he simply lays out the ways Sign is beneficial for the deaf, particularly prelingually deafened individuals. He closes with a visit to Martha's Vineyard, where deafness once ran through nearly every family on the island and every single individual knew Sign, deaf or not. (The knowledge of Sign drifted away as Martha's Vineyard became focused on tourism.)

Part 2 is a systematic view of American Sign Language itself and the way people naturally create grammar and syntax from essentially nothing. This is the longest chapter, and unfortunately suffers from an excess of footnotes and a rather dry tone. As usual, Sacks shines when writing about individuals, and the case studies he recounts in this chapter are very interesting. He quite easily demonstrates that American Sign Language is a full-fledged language in its own right, and demonstrates how languages are developed.

Part 3 was the most interesting chapter for me. Sacks details the 1988 student revolt at Gallaudet for a deaf president. He was there, and his writing about the sense of community at the college and the fervor the students felt is very interesting. The protest culminated in the appointment of King Jordan, whose retirement, announced in 2005, would lead to further controversy when the board tried to appoint a successor who was not fluent in ASL - only this time the protests also occurred online.

Overall, although parts of Sacks' books are now quite dated, it's still a very interesting read. Sacks does a good job of bringing together a lot of viewpoints, a lot of individuals, and a lot of ideas, and making them all fit together.

Monday, August 2, 2010

NPR Pioneers Captioned Radio

Girl listening to radio
On Election Night 2008, NPR Labs demonstrated their captioned radio technology in Boston, Maryland, Washington, D.C., Denver, and Phoenix. Now they are demonstrating a unique dual-view screen in the car, which allows passengers to view captioned radio while the driver sees navigation information and hears the audio of the radio.

This captioned video on YouTube does a good job of demonstrating the technology, although it does not go into specifics about how the captions work. Watching the video demonstration of the car technology, you can see that the captions run slightly behind the audio.

According to a press release from NPR, "The technology takes advantage of digital radio transmissions to send a closed-captioned transcript of a live broadcast to the screen on a specially built receiver." I wonder what happens when people deviate from the script. It would be neat if this had built-in automatic captioning via voice recognition, the way YouTube's videos do.

NPR also has new technology out for people who are blind or have low vision. It's called the Personalized Audio Information Service, and it gives them greater choice in selecting radio programming by subject. It would also ensure that HD radios can alert listeners to emergency messages.

(via About.com Deafness)