SHERRILLS FORD – Finding a new normal.
Realizing you’re not as limited as you thought.
Finding the courage to explore new paths in life and enjoy new experiences
Being part of a community where you are embraced for who you are.
These are the concepts that have made Camp Dogwood such a unique experience for the blind and visually impaired in North Carolina.
The camp celebrated its 50th year of service this week.
Greg Capps, an artist, has been visiting the camp for 10 years and appreciates the variety of activities developed for guests through the years.
His time at Camp Dogwood taught him the value of challenging what he thought were his own norms of living. He’s become one of the art teachers (beginning crocheting) at the camp and seeks to challenge
everyone else to think beyond what they perceive as limits.
“When we see people thinking in one way, we will throw something totally different at them,” Capps said. “It may not be right for them, but it’ll at least make them stop and think it’s an option. Giving people options is the key.”
Humble beginnings
Camp Dogwood, located on Lake Norman in Catawba County, was started in 1967. It is owned and operated by North Carolina Lions, Inc., according to the camp’s website, nclionscampdogwood.org.
The original idea was proposed by the North Carolina Association of Workers for the Blind, an organization made up largely of alumni of the State School for the Blind in Raleigh. The first building to go up was the “Lodge” in 1968. The camp is self-funded through donations.
Hilary Brodofsky, Camp Dogwood executive director, said the original dream is still at the core of what the camp does today: to be a safe haven and retreat for the blind and visually impaired.
“More than 700 campers come and join us each summer and we still heavily subsidize the camp to make it affordable for them,” Brodofsky said.
Week-long sessions are held June through August at a cost of $125. Local Lions Clubs help with sponsorships.
Camp director Susan King said many of the campers live isolated lives because of the lack of transportation.
“What we’re trying to do is bring the world to them the one week they’re here,” King said. “As one friend of mine, one of our campers, says, this is the one place he goes all year where he doesn’t feel disabled.”
The camp provides people with resources and a support group of peers who understand the challenges they face every day and who share their experiences.
King started at the camp in 2011, when there was a discussion about enhancing the campers’ stay. It was a more traditional program up to that time – riding horses, water activities and crafts. Since then, King has looked for ways to bring more of the world to the campers.
“We had a painting class, and I got teased a little about that, teaching blind people to paint,” she said. “The approach we took was to make the paintings very three-dimensional by adding tactile objects, beads, sand, Spanish moss, seashells.”
King even started taking groups to the Hickory Museum of Art.
“They developed a special hands-on tour for us, and they would take us to the Folk Art gallery, which tends to be three-dimensional, created with found objects,” King said.
This was a big step in the camp’s programming.
“Only 10 percent of the visually impaired community actually goes to museums because most are not equipped to provide something for them,” King said.
The camp was invited as well to visit the North Carolina Auto Racing Hall of Fame in Mooresville, which traces the history of the automobile and racing.
“They developed what they call their white glove tour. We go there and experience the exhibits by touch,” King said. “These are things folks who come to our program would never have the opportunity to
experience.”
Campers even get the chance to go bowling, shopping and take computer technology lessons.
“Our older adults who are losing their sight, they weren’t very knowledgeable about computers, but if you can get on the Internet you can find resources,” King said.
“We try to serve what the current needs of the folks who come to camp are from year to year, and it changes from year to year.”
Years of Memories
Belinda Collins has been both a camp counselor and camper for the last 18 years. She’s enjoyed the community that develops among everyone at Camp Dogwood.
“Everybody sees each other as an equal here. You may run into somebody, but everyone is always nice and friendly,” Collins said. “They understand. It’s not out in the normal society where people may get upset if you run into them.”
Kierra Houser, 25, grew up helping her mother, who is blind.
Houser has spent half her life at Camp Dogwood. She spent five years as a companion to her mother, and at 18 she decided to apply to work at the camp.
Her mother went blind because of complications from multiple sclerosis.
“Since I’ve been little, I’ve always been that helpful child,” she said. “When I came here I saw people needing help getting around and saw the counselors being busy so I thought I’d help people myself.”
Houser said the time at camp helped bring her and her mother even closer because of all the activities.
During the 50th anniversary celebration, a 20-year-old time capsule was uncovered and opened by NC Lions, Inc., President George Suggs, and another was prepared to take its place. Visitors also enjoyed a Camp Dogwood cookout and live music by Caution Blind Driver, camper Jeff Balek’s band.
For more information about Camp Dogwood, call 800-662-7401 or visit nclionscampdogwood.org.
People with Disabilities Face Barriers to Vote
November 3, 2016 | By rblakerich
Rob Johnson stopped at the table at a polling station in Rockingham County with his father and his guide dog.
Johnson’s dad said there was a woman sitting in front of him, so Johnson, who is legally blind, asked her for a ballot to vote in the 2000 presidential election.
“It’s right there in front of you,” the woman said. “Right there on the table.”
Johnson said the woman’s attitude made him angry.
“In case you could not tell, I can’t see,” Johnson said. “So how would I know the ballot is right there in front of me?”
In response, the woman huffed at him.
“She just didn’t want to help me,” Johnson said.
Johnson spoke with the polling station’s supervisor and voted with his dad’s assistance. But Johnson said he was upset by the experience.
Matthew Herr, attorney and policy analyst for Disability Rights North Carolina, said these situations when voting are not unusual for people with disabilities.
“Even if someone wasn’t denied the right to vote, they got a lot of pushback,” Herr said. “Either from, you know, staff maybe, treating the person’s disability… them not necessarily being necessarily positive about having to give that person accommodation.”
In North Carolina, about 76 percent of people with disabilities are registered to vote, compared with 90 percent of the general population, according to Disability Rights N.C.
Voter turnout in 2008 among people with disabilities in North Carolina was 14.4 percentage points lower than among the general population, compared with a 7-percentage-point gap nationally, according to a study by Rutgers University. In 2012, the gap was 5.7 percentage points nationally and 7 percentage points in North Carolina. If people with disabilities had voted at the same rate as people without disabilities, there would have been 3 million more voters in 2012, the study found.
“I think any barrier to voting is a problem for folks with disabilities,” Herr said. “And there are barriers sort of throughout the whole process.”
Polling stations offer accommodations to help people with disabilities cast their ballot, including a voting machine for people with vision impairments and curbside voting.
Tracy Reams, director of the Orange County Board of Elections, said in Orange County, those machines are set up before the polling station opens, and each station has designated curbside voting parking spaces with signs to direct people.
But Herr said many stations in North Carolina do not set up those machines until someone requests it, so a person who needs accommodations must wait longer to vote.
At other stations, Herr said someone has to go inside the building to request curbside voting, which, he said, “kind of defeats the whole purpose of curbside voting.”
Herr said many workers at polling stations are uninformed about how to assist or interact with people with disabilities. That lack of information can lead to mistreatment of people with disabilities who are trying to vote.
Reams said she has not heard of any issues in Orange County, because they address working with voters with disabilities when training workers at polling stations.
“We let them know how they can provide assistance and who is able to provide assistance,” she said.
Reams said when choosing polling stations, the board only selects places that meet criteria from the Americans with Disabilities Act.
But when studying Orange County in 2010, Disability Rights N.C. found barriers at all three surveyed polling stations, with an average of 2.33 barriers per station.
Herr said many voters with disabilities rely on absentee ballots, but a barrier to absentee voting concerned him in the 2016 election: a bill signed into law in August 2013 requires two witnesses on an absentee ballot, instead of one. The law poses a challenge for voters with small social circles, Herr said.
“Getting people to witness their ballot can be a challenge,” Herr said.
Herr said overall, available accommodations are not properly advertised. In fact, Johnson did not know there was a machine that would allow him to cast his own ballot until this year.
After hearing about the accommodations, Johnson contacted the board of elections in Rockingham County. He helped organize an event to educate his support group for people with vision impairments about voting rights. The event included a demonstration on using the voting machines.
Johnson voted early, and this time, his experience was positive. When he entered the polling station, the machine was already on the table, and he said he cast his ballot without assistance for the first time since losing his eyesight.
“It made me really proud of North Carolina and my county,” Johnson said.
Herr said Johnson made a lasting impact in his county.
“If we have someone in every county doing it, that would end up having a big impact throughout the state,” Herr said.
2016 NC Lions VIP Fishing Tournament
The 2016 N.C. Lions VIP Fishing Tournament is over, and I believe participants and volunteers who attended will say that this year’s three-day event was a huge success. Despite the flooding, damage and road closures caused by Hurricane Matthew, the tournament went on as scheduled, and the storm had little visible effect on its outcome. In fact, the weather was almost perfect for fishing.
A total of 950 people attended this year’s tournament. Sixty-eight counties in North Carolina were represented. Thirty-nine people came from other states or Canada. Nineteen blind and visually impaired people participated in the National Lions VIP Fishing Tournament, which was held after the state competition.
As a volunteer myself, I would like to make a few comments about the tournament. Putting on such a large event takes a lot of people, many of whom don’t get recognized for their efforts.
Each year hundreds of volunteers travel to the Outer Banks to help out; and members of the local Lions clubs – First Flight, Nags Head, Manteo, Columbia, Currituck, Lower Currituck and Plymouth - work hard all year to raise the money and organize the tournament. Dozens of high school students volunteer; local hotels and motels give discounted rates to provide accommodations for participants and volunteers; and businesses throughout the Outer Banks support the tournament by making financial contributions or providing in-kind services.
The tournament also receives a sizeable donation from North Carolina Lions, Inc., our state’s charitable organization. The five districts make donations that help pay for the event, and some organize transportation to get participants to the Outer Banks. The Lions of District 31-L raise money to purchase food and supplies, and they arrange to have it transported to the gymnasium at Westcott Park in Manteo, where our participants and volunteers eat their meals. Dozens of Lions clubs throughout the state participate in the adopt-a-fishermen program to off-set the registration fees of participants.
NCLI officers, district leaders and Lions from throughout Multiple District 31 show up before the tournament starts to set up the gymnasium, and many remain after the event ends to clean up the building. Volunteers cook and serve meals, prepare fishing rods for the competition, cut bait, operate ham radios, serve as nurses, sell merchandise, work with registration, clean bathrooms, haul trash, coordinate supplies, take photos, count golf balls, and interact with our visually impaired friends throughout the event.
Thanks should be given to the organization’s board members: Robert (Bob) Walton, Ron Curtis, Jana Peedin, Bob Christensen, Gary Newbern, Michelle Wright, Charlie Judge, William (Bill) Hood, Sandra Walker, Don Henry, David Grana, Angelo Sonnesso, Bill Grogg and Hunt Thomas. Each board member has an assigned responsibility during the tournament, whether it’s coordinating volunteer duties, carrying out the fishing competition, organizing the meals, securing entertainment or handling the finances. It takes a year to plan the event, and the board meets the day after the tournament ends to start planning for the next year.
Then there are non-board members who play an active role in the tournament; they include Harriett Walton, Carolyn Dunning, Wayne Wright, Rhett White, Owen White, Anne Metts, Carlton Metts, Diane Whitley, Laraine Dupree, Marti Henry, George Suggs, Tiny Curtis, Norma Hood, Herb Justice, Paranita Carpenter, Wayne Faber, Ron Staley and George Culp. There are dozens more who work behind the scenes, too many to name, but who also deserve our thanks for the jobs they do each year to help make the tournament a success.
But special recognition must be given to Gwen White, who gives countless hours of her time to serve as the tournament’s executive director. To our visually impaired friends and hundreds of volunteers, Gwen is the face of the tournament. She takes our phone calls and answers our questions; she handles registration; she assigns participants and volunteers to the four piers and two head boats used during the fishing competition; she is the contact person for motels and businesses; and she travels around District 31-S, the multiple district and country presenting programs on the fishing tournament.
It was Gwen who introduced me to the NC Lions VIP Fishing Tournament, and she is responsible for giving me the opportunity to serve as a volunteer for the past 15 years. I am thankful for her friendship and leadership, and for allowing me to work with a dedicated board and hundreds of volunteers on a truly meaningful project that is enjoyed by so many of our visually impaired friends.
By Michael H. Schwartz, Camp Dogwood Volunteer
Memorial Dedication Planned
In July 2013, Boy Scout Matthew West was a guest speaker at our noon Lions meeting. He spoke about a project he was working on to achieve the rank of Eagle Scout. His project was to construct a memorial in Freedom Park which would honor living and deceased military. Our Lions Club made a donation to assist with this project.
Matthew's project is now completed and a dedication service is planned for Saturday, March 7, 12:00 noon at Freedom Park.
Sheriff Sam Page will be speaking as well as a member of the American Legion. Several news crews will be present. Please come out and support this event if possible.
Brain Wave - Psych majors dive into the mind-bending world of sensory substitution.
By Chris Lydgate ‘90
Orestis Papaioannou ‘15 takes a cotton swab, pokes it through a hole in a mesh-fabric cap and gently massages a dab of saline gel into the scalp of his experimental subject, an English major named Dan.
As he adjusts the electrodes in the cap, Orestis - who sports a neatly-trimmed beard, flip-flops, and a t-shirt proclaiming “Life of the Mind” - runs through the experimental protocol. “We’re going to show you some shapes and play some sounds at the same time,” he explains. “You’re going to learn how to translate
sounds back into shapes.”
Dan settles himself in a chair, and dons a pair of headphones while Orestis and Chris Graulty ‘15 huddle around a computer monitor that displays the subject’s brainwaves, which roll across the screen in
a series of undulations, jumping in unison when the subject blinks.
The students are investigating the brain’s ability to take information from one perceptual realm, such as sound, and transfer it to another, such as vision. This phenomenon - known as sensory substitution - might seem like a mere scientific curiosity. But in fact, it holds enormous potential for helping people overcome sensory deficits and has profound implications for our ideas about what perception really is.
Orestis is going to show Dan some shapes on the computer screen and play some sounds through the headphones. For the first hour, each shape will be paired with a sound. Then he will play some new sounds, and Dan’s job will be to guess which shapes they go with.
The first shapes look like the letters of an alien alphabet. There’s a zero squeezed between a pair of reverse guillemets. At the same time, panning left to right, there is a peculiar sound, like R2-D2 being strangled by a length of barbed wire. Next comes an elongated U with isosceles arms: the sound is like a mourning dove
flying across Tokyo on NyQuil. Now a triplet of triangles howl like a swarm of mutant crickets swimming underwater.
To call the sounds perplexing would be a monumental understatement. They are sonic gibberish, as incomprehensible as Beethoven played backwards. But after an hour listening to the sounds and watching the shapes march past on the screen, something peculiar starts to happen. The sounds start to take on a sort of
character. The caw of a demented parrot as a dump truck slams on the brakes? Strange as it seems, there’s something, well, squiggly about that sound. A marimba struck by a pair of beer bottles? I’m not sure why, but it’s squareish.
Now the experiment begins in earnest. When he hears a sound, Dan’s job is to select which of five images fits it best. First comes a cuckoo clock tumbling downstairs - is that a square plus some squiggles or the symbol for mercury? The seconds tick away. Dan guesses wrong. On to the next sound: a buzz saw cutting through
three sheets of galvanized tin. Was it the rocketship? Yes, weirdly, it was. And so it goes for several hours. Something strange is happening to Dan’s brain. The sounds are making sense. Dan is learning to hear shapes.
How it Works: Shapes to Sounds
The system used at Reed relies on the Meijer algorithm, developed by Dutch engineer Peter Meijer in 1992 to translate images into sounds. The vertical dimension of the image is coded into frequencies between 500 and 5,000 Hz, where higher spatial position corresponds to higher pitch. The horizontal dimension is coded into a 500-ms left-to-right panning effect. The resulting sound - in theory - includes all the information contained in the image, but is meaningless to the untrained ear.
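For readers curious about the mechanics, the mapping can be sketched in a few lines of code. What follows is a minimal illustration of this kind of pitch-time encoding, not Meijer's actual implementation: the frequency range and 500-ms sweep come from the description above, but the exponential frequency spacing, the column-by-column time slicing, and all function and parameter names are assumptions for the sake of the sketch, and stereo panning is omitted for brevity.

```python
import numpy as np

def image_to_sound(image, sr=44100, duration=0.5, f_lo=500.0, f_hi=5000.0):
    """Sketch of a Meijer-style image-to-sound mapping.

    image: 2-D array, row 0 = top. Each column becomes one time slice
    of a 500-ms left-to-right sweep; each bright pixel contributes a
    sine wave whose frequency rises with height in the image.
    """
    rows, cols = image.shape
    # Exponential frequency scale: top row -> f_hi, bottom row -> f_lo,
    # so higher spatial position corresponds to higher pitch.
    freqs = f_lo * (f_hi / f_lo) ** np.linspace(1.0, 0.0, rows)
    samples_per_col = int(sr * duration / cols)
    t = np.arange(samples_per_col) / sr
    slices = []
    for c in range(cols):
        col = image[:, c].astype(float)
        # Sum one sinusoid per active pixel in this column (one time slice).
        slices.append((col[:, None] * np.sin(2 * np.pi * freqs[:, None] * t)).sum(axis=0))
    signal = np.concatenate(slices)
    peak = np.abs(signal).max()
    return signal / peak if peak > 0 else signal

# A diagonal line: the pitch falls steadily as the sweep moves left to right.
audio = image_to_sound(np.eye(8))
```

Played back, even a simple diagonal produces exactly the sort of alien chirp described above; the information is all there, but the ear has to be trained to decode it.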
The interlocking problems of sensation and perception have fascinated philosophers for thousands of years. Democritus argued that there was only one sense - touch - and that all the others were modified forms of it (vision, for example, being caused by the physical impact of invisible particles upon the eye). Plato contended that there were many senses, including sight, smell, heat, cold, pleasure, pain, and fear. Aristotle, in De Anima, argued that there were exactly five senses - a doctrine that has dominated Western thought on the
subject ever since.
The five-sense theory got a boost from psychological research that mapped the senses to specific regions of the brain. We now know, for example, that visual processing takes place primarily in the occipital lobe, with particular sub-regions responsible for detecting motion, color, and shape. There’s even an area that specializes in recognizing faces - if that part of the brain is injured by a bullet, for example, the subject will lose the ability to recognize a face, even his own.
In truth, says Prof. Canseco-Gonzalez, our senses are more like shortwave radio stations whose signals drift into each other, sometimes amplifying, sometimes interfering. They experience metaphorical slippage. This sensory crosstalk is reflected in expressions that use words from one modality to describe phenomena in another. We sing the blues, level icy stares, make salty comments, do fuzzy math, wear hot pink, and complain that the movie left a bad taste in the mouth. It also crops up in the intriguing neurological condition known as synesthesia, when certain stimuli provoke sensations in more than one sensory channel.
Canseco-Gonzalez speaks in a warm voice with a Spanish accent. She grew up in Mexico City, earned a BA in psychology and an MA in psychobiology from the National Autonomous University of Mexico, and a PhD from Brandeis for her dissertation on lexical access. Fluent in English, Spanish, Portuguese, and
Russian, she has authored a score of papers on cognitive neuroscience and is now focusing on neuroplasticity - the brain’s ability to rewire itself in response to a change in its occupant’s environment, behavior, or injury.
To demonstrate an example of sensory substitution, she gets up from her desk, stands behind me, and traces a pattern across my back with her finger. “Maybe you played this game as a child,” she says. “What letter am I writing?” Although I am reasonably well acquainted with the letters of the alphabet, it is surprisingly difficult to identify them as they are traced across my back. (I guess “R” - it was actually an “A.”)
Incredibly, this game formed the basis for a striking breakthrough in 1969, when neuroscientist Paul Bach-y-Rita utilized sensory substitution to help congenitally blind subjects detect letters. He used a stationary TV camera to transmit signals to a special chair with a matrix of four hundred vibrating plates that rested against the subject’s back. When the camera was pointed at an “X,” the system activated the plates in the shape of an “X.” Subjects learned to read the letter through the tactile sensation on their backs - essentially using the sense of touch as a substitute for the sense of vision.
Bach-y-Rita’s experiments - crude as they may seem today - paved the way for a host of new approaches to helping people overcome sensory deficits. But this line of research also poses fundamental questions about perception. Do the subjects simply feel the letters or do they actually “see” the letters?
This problem echoes a question posed by Irish philosopher William Molyneux to John Locke in 1688. Suppose a person blind from birth has the ability to distinguish a cube from a sphere by sense of touch. Suddenly and miraculously, she regains her sight. Would she be able to tell which was the cube and which the sphere simply by looking at them?
The philosophical stakes were high and the question provoked fierce debate. Locke and other empiricist philosophers argued that the senses of vision and feeling are independent, and that only through experience could a person learn to associate the “round” feel of a sphere with its “round” appearance. Rationalists, on the other hand, contended that there was a logical correspondence between the sphere’s essence and its appearance - and that a person who grasped its essence could identify it by reasoning and observation.
(Recent evidence suggests that Locke was right - about this, at least. In 2011, a study published in Nature Neuroscience reported that five congenitally blind individuals who had their sight surgically restored were not, in fact, able to match Lego shapes by sight immediately after their operation. After five days, however, their performance improved dramatically.)
On a practical level, sensory substitution raises fundamental questions about the way the brain works. Is it something that only people with sensory deficits can learn? What about adults? What about people with brain damage?
As faster computers made sensory-substitution technology increasingly accessible, Canseco-Gonzalez realized that she could explore these issues at Reed. “I thought to myself, these are good questions for Reed students,” she says. “This is a great way to study how the brain functions. We are rethinking how the brain is wired.”
She found an intellectual partner in Prof. Michael Pitts [psychology 2011-], who obtained a BA in psych from University of New Hampshire, an MS and a PhD from Colorado State, and worked as a postdoc in the neuroscience department at UC San Diego before coming to Reed. Together with Orestis, Chris, and Phoebe Bauer ‘15, they bounced around ideas to investigate. The Reed students decided to tackle a deceptively simple question. Can ordinary people be taught how to do sensory substitution?
To answer this, they constructed a three-part experiment. In the first phase, they exposed subjects to the strange shapes and sounds and monitored their brainwaves. In the second phase, they taught subjects to associate particular shapes with particular sounds, and tested their subjects’ ability to match them up. They also tested subjects’ ability to match sounds they had never heard with shapes they had never seen. Finally, they repeated the first phase to see if the training had any effect on the subjects’ brainwaves.
“I was skeptical at first,” Prof. Pitts admits. “I didn’t really think it was going to work.”
The trials took place over the summers of 2013 and 2014, and the preliminary results are striking. After just two hours of training, subjects were able to match new sounds correctly as much as 80% of the time. “That’s a phenomenal finding,” says Pitts. What’s more, the Reed team found that the two-hour training had far-reaching effects. Subjects who were given the test a year later still retained some ability to match sounds to shapes.
Just as significant, the subjects’ brainwaves demonstrated a different pattern after the training, suggesting that the shape-processing part of the brain was being activated - even though the information they were processing came in through their headphones. “Here is the punch line,” says Canseco-Gonzalez. “There is a part of the brain that processes shapes. Usually this is in the form of vision or touch. But if you give it sound, it still can extract information.”
The results - which the Reed team hopes to publish next year - suggest that the brain’s perceptual circuits are wired not so much by sense as by task. Some tasks, such as the ability to recognize shapes, are so important to human survival that the brain can actually rewire itself so as to accomplish them.
Does this mean people can really hear shapes and see sounds, or does it mean that the difference between seeing and hearing is fuzzier than we like to think? Either way, the experiment suggests that the brain retains the astonishing property of neuroplasticity well into middle age - and that we are only beginning to grasp its potential.
FlashSonar: Understanding and Applying Sonar Imaging to Mobility
by Daniel Kish, MA, MA, COMS, NOMC
From the Editor: Consciously or unconsciously, most of us who are blind gather a great deal of valuable information from the echoes that bounce off objects in the environment. Daniel Kish has done extensive study on echolocation and has pioneered methods for training blind children to use it more effectively. Daniel is cofounder and president of World Access for the Blind, a California-based organization that focuses on developing innovative approaches to improving the functioning of blind people. He holds master’s degrees in psychology and special education, and he is a certified orientation and mobility instructor.
On my first day of first grade, the bell rings and all the kids scamper gleefully away. I amble after them, occasionally clicking my tongue, listening for the wall to my left and avoiding chairs left askew. I hear kids laughing and shouting outside through the open door. I hear the sides of the doorway in front of me, and I center myself as I pass through it to the new playground beyond. After a few steps I pause to consider the strange, chaotic environment stretching out before me. I stand on a crack that runs parallel to the building behind me, where the smooth cement turns to rough pavement. I wish my feet were not covered with shoes.
I have no cane; mobility isn’t provided to children my age in 1972. I have been clicking to get around for as long as I can remember. Everyone says I’m really good at it, but I never think about it. It comes as naturally to me as breathing. I click and turn my head from side to side, scanning the expansive space before me, straining to penetrate the heavy curtain of commotion. The world suddenly seems bigger than anything I’ve ever encountered, and noisier, too--teeming with flocks of darting voices, swarms of bouncing balls, and
battalions of scuffling shoes. What is around me? How do I get there? What do I do when I find it? How do I get back?
I find the noise oppressive, like a looming wall that seems almost impenetrable. But curiosity wins out, and I step cautiously forward, clicking quickly and loudly to cut through the cacophony. I follow the clear spaces, passing between clusters of bodies, keeping my distance from bouncing projectiles. From time to time, I click back over my shoulder. As long as I hear the building call back to me through the crowd, I know I can find it again. However, its presence is fading fast. The noise undulates all around me like a thick pall of fog
enveloping my head.
The storm of noise goes on forever in all directions, and I will soon lose the building. Should I head back? A ball skitters behind me and shoes pelt after it. The sounds spur me onward. There must be grass and quiet somewhere, open space like there was on the kindergarten playground.
The pavement starts to slope slightly downward. The building is lost to me now, but I realize that if I find the slope and follow it back upward, it will point me in the right direction. The pressing din gives way to a softer hue, and my clicking inquiries find no reply, suggesting that a very big field of grass lies ahead. With relief I
speed up, eager to find open quietude. My shod feet find the grass, and the heavy fog releases me.
Stimulated by the promise of adventure, I
break into a run, quickly clicking to ensure that nothing stands in my way. I’m free as a bird taking joyful flight. Then, suddenly, something whispers back to me from the open expanse, and I jolt to a stop. “Hi,” I venture in a bell-like treble. There is no reply. As I scan, clicking more softly, the something quietly tells me about itself--it is taller than I am and too thin to be a person. When I reach out to touch it, I know already that it is a pole. I’m glad I found it with my ears and not my head. The pole has a small metal cap on top. I click around me, and barely hear something else whispering back. Leaving the pole, I move toward this next thing as it calls to me with a similar voice, telling me that it is also a pole. I detect yet another one, and another--nine poles in a straight line. Later I learn that this is a slalom course. In time I practiced biking by slaloming rows of trees while clicking madly.
A buzzer abruptly slices the air. I am not startled, but I freeze and raise my hands to my ears. When it finally ceases, I lower my hands to hear buildings from far away calling back to the buzzer. I detest the buzzer, but the distant voices echoing back sound like wistful music. I scan around me, clicking, but I can’t hear the building over the great distance and bedlam of kids. I clap my hands with a sharp report, and something large calls back through the tangle of piping voices and scurrying shoes. I turn in that direction. The grass gives way to pavement, and as I step quickly up the slope, clicking and clapping, I hear the unmistakably broad, clear voice of a wall drawing nearer.
The crowd noise has organized itself and is not quite so assaultive. I hear kids in lines facing the wall. I don’t know why they’re lining up or what I’m supposed to do, and I can’t tell where my classroom is. The wall sounds completely featureless, offering no information. I ask a question, and someone points me in the right direction. I start to walk along the crack parallel to the wall, but kids are standing on it. I move in toward the wall, clicking and walking between it and the fronts of the lines until someone calls my name. I
find the right line and, turning away from the building, I click my way along the line until it runs out of bodies, now all quiet as directed by the teacher. I lay my hands on the shoulders of the kid in front of me as I was taught--a boy I would guess by his T-shirt and short hair. As the line moves and we enter the room, I let go, clicking and scanning to avoid kids as they shuffle into their chairs. I click along the wall to my right until I near a corner. Sensing the distance from the wall in front of me, I know I’m near my desk at the end of my row. I reach to my left and find a desk with a Braillewriter on it. I take my seat, wondering how big the new playground is, and if it has a slide. I wriggle with excitement to find out more next recess.
Perceiving the Environment
Through our perceptual system, the brain constructs images to represent everything we experience in our conscious minds. The way we interact with the environment depends upon the quality of these images. When vision is disrupted, the brain naturally works to maintain image quality by optimizing its ability to perceive through other senses. The brain seeks to discover and explore in order to heighten the quality of meaningful information gathered through our experiences. The inability to see with our eyes need not be disabling when the brain learns to “see” with an intact and heightened perceptual imaging system. Indeed, the visual system of the brain is recruited to assist in processing nonvisual stimuli such as echoes and
tactile information. Our approach to long cane and FlashSonar training is thus based in perceptual science in order to activate the imaging system quickly and efficiently.
Cane travel and other areas of perceptual training are integral to our approach to orientation and mobility. If I could redo anything about my childhood, it would be to have a long white cane available to me. We have developed approaches to long cane training for children at their first steps and before. However, this article focuses on FlashSonar, as we feel it is the least understood and most poorly implemented element in standard mobility training.
Both sight and hearing interpret patterns of energy reflected from surfaces in the environment. Reflected sound energy is called echo. The use of echoes, or sonar location, can help a person perceive three
characteristics of objects in the environment--location, dimension (height and width), and depth of structure (solid vs. sparse, reflective vs. absorbent). This information allows the brain to extract a functional image of the environment for hundreds of yards, depending on the size of the elements and strength of the sonar
signal. For example, a parked car, detectable from six or seven yards away, may be perceived as a large object that starts out low at one end, rises in the middle, and drops off again at the other end. The differentiation in the height and slope pitch at either end can identify the front from the back; typically, the front will be lower, with a more gradual slope up to the roof. Distinguishing between types of vehicles is also possible. A pickup truck, for instance, is usually tall, with a hollow sound reflecting from its bed. An SUV is usually tall and blocky overall, with a distinctly blocky geometry at the rear. A tree is imaged according to relatively narrow and solid characteristics at the bottom, broadening in all directions and becoming more sparse toward the top. More specific characteristics, such as size, leafiness, or height of the branches can also be determined. Using this information in synergy with other auditory perceptions as well as touch and the long cane, a scene can be analyzed and imaged, allowing the listener to establish orientation and guide movement within the scene.
Passive and Active Sonar
There are two types of sonar processing--passive and active. Passive sonar is the most widely used type among humans. It relies on sounds in the environment or sounds casually produced by the listener, such as footsteps or cane taps. The images thus produced are relatively vague and out of focus. Passive sonar may be sufficient for detecting the presence of objects, but not for distinguishing detailed features. It’s a little like hearing the murmur of other people’s conversations around you. You catch bits and pieces, but the information contained therein may or may not be relevant or discernible.
Active sonar involves the use of a signal that is actively produced by the listener. It allows the perception of specific features as well as objects at greater distances than passive sonar. It’s more like engaging in active conversation with elements of the environment. One can ask specific questions of particular elements and receive clearer answers. In fact, scientists who study bats call the process of bat sonar “interrogating the environment.” The bat is actively involved in querying features of the environment for specific information through an array of complex sonar calls almost as varied and strategic as a language. Only recently has it been made clear that humans can learn to do likewise.
Because of its relative precision, active sonar is used most widely in nature and in technical applications. The greater accuracy of active sonar lies in the brain’s ability to distinguish the characteristics of the signal it produces from those of the returning echo. The echo is changed by the environment from which the signal bounces. These changes carry information about what the signal encounters. In our work with blind students we use the term FlashSonar because the most effective echo signals resemble a flash of sound,
much like the flash of a camera. The brain captures the reflection of the signal, much like a camera’s film.
Perhaps the greatest advantage of FlashSonar is that an active signal can be produced very consistently and the brain can tune to this specific signal. Elicited echoes can easily be recognized and small details detected, even in complex or noisy environments. It’s like recognizing a familiar face or voice in a crowd. The more familiar the face, the more easily it is recognized. The characteristics of an active signal can be controlled deliberately by the user to fit the requirements of a given situation, and the brain is primed to attend to each echo by virtue of its control over the signal.
Discerning the Signals
Tongue clicks can be used effectively to gather sonar information about the environment. The click should be sharp, similar to the snap of a finger or the pop of chewing gum. It can be very discreet, no louder than the situation dictates. Hand claps or mechanical clickers may be used as a backup, but they require the use of the hands and are not easily controlled. Clickers are generally too loud for indoor use. They should never be sounded near the ears, and never clicked more than once every two or three seconds. Cane taps can be used in a pinch, but the signal is poorly aligned with the ears, and it is inconsistent as surface characteristics change. The use of cane taps in this way may encourage unnecessarily noisy or sloppy cane technique.
We find that sonar signals are rarely noticed by the general public, so they pose no concern about appearing abnormal. They generally result in improved posture, more natural gait and head movement,
greater confidence, and more graceful interaction with the environment.
When we teach FlashSonar to students, we start by sensitizing them to echo stimuli. Usually we have them detect and locate easy targets such as large plastic panels or bowls. The idea is to help the student get a sense of how echoes sound. We call this a “hook stimulus” because it hooks the brain’s attention to a stimulus that it might otherwise ignore. Once this recognition is established, we gradually move to subtler and more complex stimuli.
We use stimulus clarification to help a student perceive a stimulus that he/she may not sense, such as an open door or a pole. To clarify the stimulus, we may use a large pole or a wide doorway, or use a reverberant room beyond the doorway. Once the student can detect the clarified stimulus, we return to the original stimulus.
Our most frequent approach is stimulus comparison. We exemplify the sounds of environmental characteristics by using A-B comparisons wherever possible. For example, solid vs. sparse may be shown by comparing a fence to a nearby wall. A high wall might be found near a low wall, a tree near a pole, or a large alcove near a smaller one. We try to locate training environments that are rich with stimulus variation. The characteristics of almost any object or feature can be better understood when compared to something distinctly different.
Stimulus association is the conceptual version of stimulus comparison. Instead of comparing elements in the environment, we are comparing real elements to those in our minds by drawing upon mental references.
For example, when facing a hedge, a student might say, “It sounds solid.”
I might reply, “As solid as the wall to your house?”
“No, not that solid,” she might say.
“As sparse as the fence of your yard?”
“No, more solid than that,” she might answer.
Now we have a range of relativity to work with. “Does it remind you of anything near your house, maybe in the side yard?”
“Bushes?” she might query.
“What seems different from those bushes?”
“These are sort of flat like a fence.”
If she still can’t put words to what she is perceiving, we tell her what the object actually is--a hedge. Ultimately, we have students verify what they hear by touching and exploring.
We also encourage precision interaction with the environment. For instance, we might have a student practice walking through a doorway without touching it, with the door closed more and more to narrow the gap, or have a student locate the exact position of a thin pole and reach out to touch it without fishing for it. We also work on maintaining orientation and connectedness with surfaces in complex spaces. A good example of this is moving diagonally from one corner to another across a very large room, like an auditorium or gym. Students learn to hear the corner opening up behind them while closing in before them, and to keep their line between the two. The world is not made of squares and right angles, but of angles and curves. This
exercise helps stimulate the ability to process nonlinear space. It is surprisingly difficult for many students, but surprisingly easy for others. Once this task is mastered, we place obstacles to be negotiated while still maintaining orientation.
Freedom to Explore
Ultimately, we support students to be able to orient themselves and travel confidently through any space, familiar or not. We practice finding and establishing the relative locations of objects and reference points in a complex environment, such as a park or college campus. The students walk through the area, keeping track of their location with respect to things they can hear and echolocate. They are discouraged from staying on paths, but urged to venture across open spaces. We find and map objects and features until the space is learned. Active echolocation makes this process go much faster.
The most important thing is to allow and encourage blind children to explore their environment without constantly supervising their every move or structuring all of their activities. It is important that they often direct their own movements, not relying strictly on direction from others. The occasional hint is nice, but spoon-feeding our kids all the answers is debilitating; it breaks down the perceptual system. We must remember that the brain is like a muscle. It only gets stronger with self-directed exercise. The earlier this happens in a child’s life, the more comfortable and friendly the child’s relationship with the environment will become.
The mother of a boy we have worked with wrote to us about her son’s progress. I quote from her letter:
“My youngest son, Justin, totally blind, is five ... we introduced Justin to a white cane when he was eighteen months old, ... [and] he ... processes the information he gains from it very effectively. Justin is a very active, outgoing fellow who loves socializing and sports of any kind. ... the work [Daniel] has done with Justin has had tremendous results. ... Walls are easy for Justin to hear. He has moved on to identify parked cars, store displays, other solid objects like newspaper boxes, bushes, and more, all with the click of his tongue. ... If I ask Justin to go and find a ... solid object that doesn’t make noise, he will click his tongue and ... set off in that direction. As he nears it, he will actually pick up speed and become more confident. ... He can then stop short of it. ... The delight on his face when ... he discovers what he has found is unparalleled. ... The other day my husband asked Justin to tell him when the type of fence changed along the street. ... Clicking his tongue, Justin could tell him when the fence changed from brick to wrought iron. ... we had seen Justin using echolocation on his own as a toddler. ... I’m not sure how much Justin knew what he was doing, or how much further he would have taken it. I know that I have heard a lot of blind adults
say that they use echolocation to some degree. ... But in Justin’s case, with structured training, his potential in this area is being drawn out and he is learning to use echolocation more effectively than he would have otherwise. ...
“Probably my greatest strength ... is my ability to teach him social skills. I have a very strong interest in this area, and it shows in who Justin is becoming. He is extremely well spoken, ... very outgoing, confident, and well-liked by his friends and classmates. ... a tongue click ... is hardly noticeable. In fact, unless you were
listening specifically for it, I don’t know that you would notice it. Okay, if you are blind you almost surely would, but I am commenting as a sighted person. It is hardly noticeable at all. ... The tongue click
in no way resembles a blindism or mannerism. ...
“Today Justin does not exhibit any mannerisms. ... Here is something positive that [echolocation] does do. It keeps the head up nicely, because when you click to scan your environment you lift your head up instead of hanging it down. ... We, like any parents, want the best for our son. We want him to be as independent and free as he can be. To give him that, we want him to have access to all the options so that he knows what is possible and can make his own choices. ... I strive to give my child access to all of the resources I can to help
him become who he wants to be. This is one such resource. ... Echolocation training is most definitely helping to accomplish that goal.
--Tricia”
Try It Yourself
For sighted parents, close your eyes and have someone hold a bowl or open box in front of your face. Speak. Listen to the hollow sound of your voice. Now have the box removed and put back. Hear how your voice sounds open or closed in. Try the same thing with a larger box or a pot. Hear how your voice sounds different between smaller and larger objects--perhaps deeper for the large pot.
Try the same thing with a pillow or cushion, and notice that your voice sounds soft instead of hard. Try going to the corner of a room and hear how your voice sounds hollow when you’re facing the corner. How does the sound change when you face the center of the room?
For further information and video demonstrations of our FlashSonar program, please visit .
Low Vision Study
Last month, at the 2014 American Academy of Optometry Annual Meeting, a group of student researchers from the New England College of Optometry presented survey data that identified (a) patient barriers
to low vision services and (b) the actions optometrists can take to improve the efficiency of referrals to low vision specialists.
Their research revealed a discrepancy between what primary care optometrists and low vision specialists define as low vision (i.e., a functional versus numerical definition); this discrepancy creates a situation in which many patients who could benefit from low vision services are not being referred. The research group concluded that “developing a standardized definition [of low vision] would be advantageous to help normalize the referral and treatment processes.”
This important low vision research, entitled Bridging the Gap: Improving the Efficacy of Referrals from Primary Care Optometrists to Low Vision Specialists, was conducted by Anne Bertolet, Emily Humphreys, Hannah Woodward, Jessica Zebrowski, Inna Kreydin, and Jenna Adelsberger, from the class of 2017 at the New England College of Optometry (NECO).
Their research focused on identifying patient barriers (economic status, physical distance from an office, lack of information) to low vision treatment and what optometrists can do to improve the efficacy of referrals to low vision specialists.
Anne Bertolet explains, “One of the major points our results suggest is that there is a discrepancy between what primary care optometrists and low vision specialists define as low vision. The majority of low vision optometrists use a functional definition of low vision: any visual impairment that can hinder quality of life or daily functioning.”
“Interestingly, we found that primary care optometrists were a lot more varied in their definition, with less than half choosing a functional definition and the rest opting for various best-corrected visual acuity-based definitions.”
“This suggests that there are some patients who could benefit from low vision services, but are not getting referred and that developing a standardized definition would be advantageous to help normalize the referral and treatment processes.”
About Low Vision
If your eye doctor tells you that your vision cannot be fully corrected with ordinary prescription lenses, medical treatment, or surgery, and you still have some usable vision, you have what is called low vision. Having low vision is not the same as being blind.
Having low vision means that even with regular glasses, contact lenses, medicine, or surgery, you might find it challenging, or even impossible, to perform everyday tasks, such as reading, preparing meals, shopping, signing your name, watching television, playing card and board games, and threading a needle.
You can learn more about low vision, including the differences between low vision and legal blindness, at Low Vision and Legal Blindness Terms and Descriptions.
What Help Is Available?
Doctors who are low vision specialists can provide you with a low vision exam as a first step in determining how you can best use your remaining vision.
Often, a low vision specialist can give you recommendations about optical and non-optical devices and vision rehabilitation services that can help you to maximize your remaining vision and learn new ways of doing everyday tasks.
Some examples of helpful devices that a low vision specialist can discuss with you include:
- illuminated stand magnifiers or electronic aids for reading
- strong glasses or small telescopes for seeing the computer screen, reading sheet music, or sewing
- telescopic glasses for seeing television, faces, signs, or other items at a distance
- glare shields for reducing glare and enhancing contrast
- adaptive daily living equipment to make everyday tasks easier, such as clocks with larger numbers, writing guides, or black and white cutting boards to provide better contrast with food items.
In addition, low vision services can include any or all of the following:
- training to use optical and electronic devices correctly
- training to help you use your remaining vision more effectively
- improving lighting and enhancing contrast in each area of your home
- providing a link with a counselor or a support group
- learning about other helpful resources in the community and state, such as vision rehabilitation services.
More About the Research
Excerpted from Patients Missing Out on Low Vision Services, via Medscape (registration required):
Many baby boomers who could benefit from low vision therapy aren’t getting it for a variety of reasons, including the lack of a standard definition of low vision and lack of referral to low vision specialists, a new survey shows.
“Despite the clear advantages, there remains a discrepancy between the number of patients who would benefit from low vision services and utilization of these services,” report investigators from the New England College of Optometry, Boston, Massachusetts.
“While there have been studies geared towards patient barriers (economic status, physical distance from an office, etc), there wasn’t really any research focusing on what we as optometrists could do to improve the efficacy of referrals to low vision specialists,” Anne Bertolet, who worked on the survey, told Medscape Medical News.
The investigators surveyed 19 primary care optometrists who were members of the Massachusetts Society of Optometrists and eight low vision specialists at optometry schools across the country. They asked about low vision definitions, available resources, and referral practices.
Fourteen of the 19 primary care optometrists said they refer patients to low vision specialists. But a major finding, Bertolet said, was the discrepancy between how low vision specialists and primary care optometrists define low vision.
“The majority of low vision optometrists use a functional definition of low vision: any visual impairment that can hinder quality of life or daily functioning,” she explained. “On the other hand, primary care optometrists were a lot more varied in their definition, with less than half choosing a functional definition and the rest opting for various best-corrected visual acuity-based definitions.”
Bertolet and colleagues favor defining low vision as any visual impairment that impedes functionality. “Numerical definitions do not take into account a patient’s quality of life, and may make it difficult for some patients to afford the care that could improve their livelihood,” Bertolet said.
“While the majority of primary care optometrists stated they are providing resources (pamphlets, magnifiers, CCTVs, etc) or educating patients about low vision services and treatment options, most low vision specialists report patients are not aware of the resources available to them at the time of their first visit,” Bertolet said.
“This suggests that there is ineffective communication from primary care doctors to patients in regards to low vision care. Clear communication is especially important in low vision referrals because patients are more likely to follow through if they understand the potential benefits of low vision services,” she explained.
New Call Center Technology Adapts to Visually Impaired Employees
by Ian Barker
Meeting the needs of employees to access systems is an important consideration for any modern business. Now, cloud-based call center systems provider TCN is making its technology accessible for the visually impaired.
Its Platform 3 VocalVision product is optimized to work with Job Access with Speech (JAWS) technology and allows visually impaired call center agents to effectively navigate TCN’s cloud-based contact center
suite, helping to improve agent productivity while also creating new employment opportunities for the visually impaired.
“We’ve been impressed with TCN’s VocalVision cloud-based phone service. They have been very willing to customize the solution to meet our employees’ accessibility needs. So, our employees like the system and its ease of use. It has enhanced the level of service we can offer our customers through its call recording and time reporting capabilities,” says Jim Kerlin, president and CEO of Beyond Vision. “In the future, we plan to use the system to measure and report productivity and utilization metrics, just as we do in our manufacturing environment.”
JAWS assists computer users whose vision impairment prevents them from seeing screen content or being able to operate a mouse. VocalVision helps the agent navigate Platform 3.0’s workflows via hot keys that
use JAWS functionality during both incoming and outgoing calls, while audible tones signal the connection of an incoming call.
Standard features of VocalVision include incoming and outgoing call handling, an agent dashboard, and reporting and call analytics. It works without any need for complex, specialist hardware.
“TCN has always been committed to providing customers with the most advanced call center technology,” says Terrel Bird, CEO and co-founder of TCN. “We are excited to bring Platform 3 VocalVision to the market to meet the needs of the visually impaired community and open new doors for employment.”
Aiming for Accessibility for All --
IEC, ISO and ITU publish new international accessibility guide
We often take easy-to-use products or services and unhindered access to facilities for granted, but when we encounter difficulties in the use of a device or service, the underlying importance of accessibility really hits home.
Standards that include accessibility requirements can support the development of devices and systems that can be accessed by a greater number of users.
With this in mind, the IEC (International Electrotechnical Commission), the international standards and conformity assessment body for all fields of electrotechnology, ISO (International Organization for Standardization), and ITU (the International Telecommunication Union) have published a new Guide, ISO/IEC Guide 71:2014, entitled Guide for addressing accessibility in standards, to help ensure that standards
take full account of the accessibility needs of users from all walks of life.
Guide 71 provides practical advice to standards developers so that from the start, they can address accessibility in standards that focus, either directly or indirectly, on any type of system that people use. It covers mainly the accessibility needs of persons with disabilities, children and older persons.
Over a billion people are estimated to live with some form of disability, according to the World Health Organization. This corresponds to about 15% of the world’s population. Between 110 million (2.2%) and 190 million (3.8%) people aged 15 and older have significant difficulties in functioning. Furthermore, the number of persons living with a disability is increasing, due in part to ageing populations and a rise in chronic health conditions.
However, accessibility is not only a disability issue. The accessibility and usability of products, services and environments have become increasingly critical for everybody, regardless of age or ability. The prevalence of digital technology in our daily lives is a clear example of the necessity to ensure accessibility for as many
people as possible.
IEC General Secretary and CEO Frans Vreeswijk said, “It is an important goal for the whole of society that all people, regardless of their age, size or ability, have access to the broadest range of systems. Standards that include accessibility requirements can support the development of devices and systems that can be accessed by a greater number of users.”
The new Guide 71 will help those involved in the standards development process to consider accessibility issues when developing or revising standards, particularly in areas where they have not been addressed
before. It will also be useful for manufacturers, designers, service providers, and educators with a special interest in accessibility.
A Glimpse of the Future: Bionic Eye Helps the Blind
Famous for its film studios and music industry, Los Angeles is a city rich in innovation and creativity. Today, it is also home to a raft of companies that are helping to transform lives with breakthroughs in medical science.
At Second Sight Medical Products, based in the San Fernando Valley area, researchers are helping to lead the fight against loss of sight. The Argus II Retinal Prosthesis System is the world’s first FDA approved ‘bionic’ eye, a sophisticated piece of technology that is helping the blind to see.
“It’s the first time that there’s ever been a treatment that ophthalmologists have for patients who are completely blind,” Dr. Robert Greenberg, CEO of Second Sight Medical Products, told CNBC’s Innovation Cities.
For more than 30 years Arizona resident Lisa Kulik has been living with retinitis pigmentosa, a degenerative disease which eventually leads to blindness.
“When I was diagnosed… I was pretty much told by every doctor that I saw that there was no cure, and there was nothing that I can do,” Kulik told Innovation Cities.
Today, Kulik has cause for optimism. Last June, a chip was implanted on her retina; she was fitted with a small portable unit and given a pair of glasses with a small camera.
“The implant picks up the signal from the video camera wirelessly, and then electrically stimulates the retina at the back of the eye and produces the perception of light for these patients,” Greenberg said. “The best patients have actually been able to read letters and even read words,” he added.
While patients regain just some of their vision, and the Argus II is effective in treating only certain types of blindness, the impact on Kulik’s life has already been huge.
“The first thing that I saw with my device was the full moon in the sky,” she said. “A few weeks later, [on] the fourth of July, I saw the fireworks, which I haven’t seen in over 30 years,” she added. “It has given me confidence, for sure.”
The World Health Organization estimates that 285 million people are visually impaired worldwide, with 39 million being blind. According to the WHO, 90 percent of people with visual impairments live in low income environments.
This puts the treatment Kulik has received – the Argus II costs 115,000 euros ($148,959) – out of reach for millions, at least for now.
The potential of this kind of technology to transform the lives of the visually impaired is nevertheless huge, according to Greenberg.
“Our next generation product is aimed at putting an implant in the visual part of the brain... directly interfacing to the brain to restore vision, hopefully to patients who are blind from nearly all causes,” he said.
“Talking Guide” Reads Aloud Channel Names, Show Titles and DVR Commands; Allows for Independent Search and Discovery for People with Disabilities
Comcast today announced the industry’s first voice-enabled television user interface, a solution that will revolutionize the way its Xfinity TV customers, especially those who are blind or visually impaired, navigate the X1 platform. The “talking guide” features a female voice that reads aloud selections like program titles, network names and time slots as well as DVR and On Demand settings. The feature will be available to all X1 customers in the next few weeks.
About 19 million U.S. households have at least one member with a disability and according to the U.S. Census, there are 8.1 million people with a visual disability. In 2012, Comcast hired Tom Wlodkowski as Vice President of Audience to focus on the usability of the company’s products and services by people with disabilities.
“Television is universally loved, and we want everyone to be able to enjoy it,” said Brian Roberts, Chairman and CEO of Comcast. “The talking guide feature will enable all of our customers to experience the X1 platform in a new way, and give our blind and visually impaired customers the freedom to independently explore and navigate thousands of shows and movies. We’re just scratching the surface of what’s possible in the accessibility space and we are thrilled to have Tom and his team leading the charge.”
X1 Talks Remote Control
The talking guide “speaks” what’s on the screen as the viewer navigates the “Guide,” “Saved,” “On Demand,” and “Settings” sections of X1 and includes details like individual program descriptions and ratings from Common Sense Media and Rotten Tomatoes that help viewers decide what to watch. Future versions of the feature will include
functionality within the “Search” section of X1 and additional personalization settings like rate of speech.
“The talking guide is as much about usability as it is about accessibility,” said Mr. Wlodkowski. “We think about accessibility from the design of a product all the way through production and this feature is the result of years of work by our team including customer research, focus groups and industry partnerships. For people like me who are blind, this new interface opens up a whole new world of options for watching TV.”
X1 customers will be able to activate the talking guide on their existing set-top box by tapping the “A” button twice on their remote control. The feature also can be turned on via the “accessibility settings” within the main settings menu.
“Programming my DVR is one of the most empowering things I have ever done with my TV,” said Eric Bridges, Director of External Relations and Policy Development at The American Council of the Blind (ACB), who
participated in a Comcast customer trial over the summer. “My wife and I are both blind, so thanks to this new feature, we no longer have to choose between going out to dinner or catching our favorite show. The talking guide encourages independence and self-sufficiency; it’s a real game-changer for anyone who is blind and loves TV.”
Next year, Comcast plans to partner with service organizations and nonprofits to create awareness in the disability community of Voice Guidance and other accessibility features that offer a more inclusive entertainment experience.
“TV is such an important and integral part of the fabric of our culture that to be excluded from that experience in any way makes it more difficult for blind people to participate fully in society,” said Amy Ruell, a Comcast customer. “I had a chance to test this feature over the summer and I probably watch more TV than ever thanks to the talking guide. Comcast’s commitment to accessibility is encouraging because it means there will be tremendous progress in developing technology that is universally accessible.”
The talking guide is the latest in a series of innovations created in the Comcast Accessibility Lab. In addition to voice guidance and one-touch access to closed captioning, Comcast created an online help and support resource for Xfinity customers looking for information about accessibility-related topics. The webpage includes an overview of accessibility products and services, support for third-party assistive devices, information related to Braille or large-print bills and the ability to connect with accessibility support specialists.
The company also has a service center specifically dedicated to customers with disabilities. Comcast’s Accessibility Center of Excellence is based in Pensacola, FL, where a team of specially trained care agents handle about 10,000 calls each month.
Electronic EyeCane
User-friendly electronic ‘EyeCane’ enhances navigational abilities for the blind - tactile or auditory cues can enhance the mobility of blind users, according to a study published in Restorative Neurology and Neuroscience.
White Canes provide low-tech assistance to the visually impaired, but some blind people object to their use because they are cumbersome, fail to detect elevated obstacles, or require long training periods to master. Electronic travel aids (ETAs) have the potential to improve navigation for the blind, but early versions had disadvantages that limited widespread adoption. A new ETA, the “EyeCane,” developed by a team of researchers at The Hebrew University of Jerusalem, expands the world of its users, allowing them to better estimate distance, navigate their environment, and avoid obstacles, according to a new study published in Restorative Neurology and Neuroscience.
“The EyeCane was designed to augment, or possibly in the more distant future, replace the traditional White Cane by adding information at greater distances (5 meters) and more angles, and most importantly by eliminating the need for contacts between the cane and the user’s surroundings [which makes its use difficult] in cluttered or indoor environments,” says Amir Amedi, PhD, Associate Professor of Medical Neurobiology at The Israel-Canada Institute for Medical Research, The Hebrew University of Jerusalem.
The EyeCane translates point-distance information into auditory and tactile cues.
The device provides the user with distance information simultaneously from two different directions: directly ahead, for long-distance perception and detection of waist-height obstacles, and pointing downward at a 45° angle, for ground-level assessment. The user scans a target with the device; the device emits a narrow beam with high spatial resolution toward the target; the beam hits the target and returns to the device; and the device calculates the distance and translates it for the user interface. Within a few minutes the user intuitively learns to decode the distance to the object via sound frequencies and/or vibration amplitudes.
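To make that translation concrete, here is a minimal Python sketch of how a single distance reading might be mapped to both an auditory and a tactile cue. The function name, the 5-meter range, the frequency band, and the assumption that closer objects produce a higher pitch and stronger vibration are all illustrative choices, not the EyeCane's actual firmware.

```python
# Illustrative sketch (not the EyeCane's real implementation): map a
# measured distance to a tone frequency and a vibration amplitude.
# Assumption for this sketch: nearer obstacles produce higher-pitched
# tones and stronger vibration, within a roughly 5-meter sensing range.

MAX_RANGE_M = 5.0  # the article cites a range of about 5 meters

def distance_to_cues(distance_m: float,
                     min_freq_hz: float = 200.0,
                     max_freq_hz: float = 2000.0) -> tuple[float, float]:
    """Return (tone frequency in Hz, vibration amplitude in 0..1)."""
    d = max(0.0, min(distance_m, MAX_RANGE_M))
    proximity = 1.0 - d / MAX_RANGE_M       # 1.0 at contact, 0.0 at max range
    freq = min_freq_hz + proximity * (max_freq_hz - min_freq_hz)
    amplitude = proximity
    return freq, amplitude

print(distance_to_cues(0.5))   # close obstacle: high pitch, strong vibration
print(distance_to_cues(5.0))   # at maximum range: low pitch, no vibration
```

In an actual device this mapping would drive a tone generator and a vibration motor; the linear scheme above simply shows how one distance reading can feed both cue channels at once.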
Recent improvements have streamlined the device so its size is 4 x 6 x 12 centimeters with a weight of less than 100 grams. “This enables it to be easily held and pointed at different targets, while increasing battery life,” says Prof. Amedi.
The authors conducted a series of experiments to evaluate the usefulness of the device for both blind and blindfolded sighted individuals.
In one experiment, the researchers tested whether the EyeCane could help individuals navigate an unfamiliar corridor, measured by the number of contacts with the walls. Those using a White Cane made an average of 28.2 contacts with the wall, compared to three contacts with the EyeCane: a statistically significant, roughly tenfold reduction. Another experiment demonstrated that the EyeCane also helped users avoid chairs and other naturally occurring obstacles placed randomly in the surroundings.
“One of the key results we show here is that even after less than five minutes of training, participants were able to complete the tasks successfully,” says Prof. Amedi. “This short training requirement is very significant, as it makes the device much more user-friendly. Every one of our blind users wanted to take the device home with them after the experiment, and felt it could immediately contribute to their everyday lives,” adds study co-author Maidenbaum.
The Amedi lab is also involved in other projects to help people who are blind. In another recent publication in Restorative Neurology and Neuroscience, the lab introduced the EyeMusic, which offers much more information but requires more intensive training. “We see the two technologies as complementary,” says Prof. Amedi. “You would use the EyeMusic to recognize landmarks or an object and use the EyeCane to get to it safely while avoiding collisions.”
For more information, visit the Amedi lab website at http://brain.huji.ac.il. Amir Amedi’s Lab for Multisensory Research uses cutting-edge technologies and innovative methods to study perception, multisensory relations, sensory substitution approaches and the dynamics of brain processes.
Japanese Woman is First Recipient of Next-Generation Stem Cells
Researchers grew a sheet of retinal tissue from stem cells created from a woman’s skin cells, then implanted it into her eye.
A Japanese woman in her 70s is the first person to receive tissue derived from induced pluripotent stem cells, a technology that has created great expectations since it could offer the same regenerative potential as embryo-derived cells but without some of the ethical and safety concerns.
In a two-hour procedure, a team of three eye specialists led by Yasuo Kurimoto of the Kobe City Medical Center General Hospital implanted a 1.3 by 3.0 millimeter sheet of retinal pigment epithelium cells into an eye of the Hyogo prefecture resident, who suffers from age-related macular degeneration, a common eye condition that can lead to blindness. The procedure took place at the Institute for Biomedical Research and Innovation, next to the RIKEN Center for Developmental Biology (CDB), where ophthalmologist
Masayo Takahashi had developed and tested the epithelium sheets. Takahashi had reprogrammed some cells from the patient’s skin to produce induced pluripotent stem (iPS) cells. (‘Pluripotent’ means able to differentiate into virtually any type of tissue in the body.) She then coaxed those cells to differentiate into retinal pigment epithelium cells and grow into a sheet for implantation.
The patient had no effusive bleeding or other serious problems after the surgery, RIKEN reported. The patient “took on all the risks that go with the treatment as well as the surgery”, Kurimoto said in a statement released by RIKEN. “I have utmost respect for the bravery she showed in resolving to go through with it.”
He hit a sombre note in thanking Yoshiki Sasai, a CDB researcher who recently committed suicide. “This project could not have existed without the late Yoshiki Sasai’s research, which led the way to differentiating
retinal tissue from stem cells.” Kurimoto also thanked Shinya Yamanaka, a stem-cell scientist at Japan’s Kyoto University “without whose discovery of iPS cells, [such] clinical research would not be possible.” Yamanaka shared the 2012 Nobel Prize in Physiology or Medicine for that work. Kurimoto performed the two-hour procedure a mere four days after a health-ministry committee gave clearance for the human trial.
Proposed Change for Theaters
On August 1, 2014, the United States Department of Justice issued the following notice of proposed rulemaking regarding Nondiscrimination on the Basis of Disability by Public Accommodations—Theaters, Movie Captioning, and Audio Description:
Summary: The Department of Justice (Department) is issuing this notice of proposed rulemaking (NPRM) in order to propose amendments to its regulation for title III of the Americans with Disabilities Act
(ADA), which covers public accommodations and commercial facilities, including movie theaters.
The Department is proposing to explicitly require movie theaters to exhibit movies with audio description
and closed captioning at all times and for all showings whenever movies are produced, distributed, or otherwise made available with captioning and audio description, unless to do so would result in an undue burden or fundamental alteration.
The Department is also proposing to require movie theaters to have a certain number of individual closed captioning and audio description devices, unless to do so would result in an undue burden or fundamental alteration. The Department is proposing a six-month compliance date for movie theaters’ digital movie screens and is seeking public comment on whether it should adopt a four-year compliance date for movie theaters’ analog movie screens or should defer rulemaking on analog screens until a later date.
The Department invites written comments from members of the public. Written comments must be postmarked and electronic comments must be submitted on or before September 30, 2014. Comments received by mail will be considered timely if they are postmarked on or before that date. The electronic Federal
Docket Management System (FDMS) will accept comments until midnight Eastern Time at the end of that day.
You may submit comments, identified by RIN 1190-AA63, by any one of the following methods:
Federal eRulemaking website: http://www.regulations.gov.
Follow the website’s instructions for submitting comments. The Regulations.gov Docket ID is DOJ-CRT-126.
Regular U.S. mail: Disability Rights Section, Civil Rights Division, U.S. Department of Justice, P.O. Box 2885, Fairfax, VA 22031-0885. Overnight, courier, or hand delivery: Disability Rights Section, Civil Rights Division, U.S. Department of Justice, 1425 New York Avenue NW, Suite 4039, Washington, DC 20005.
You can read the full announcement at the Government Printing Office website.
Questions and Answers about the Proposed Rule, Audio Description, and Closed Captioning
Excerpted from Questions and Answers about the Department of Justice’s Notice of Proposed Rulemaking Requiring Movie Theaters to Provide Closed Movie Captioning and Audio Description at ADA.gov.
Atlanta Eclipse Beep Baseball
In March of 2010, the first practice of the Atlanta Eclipse Beep Baseball Team took place with about 15 players and 8 volunteers, most of whom knew little to nothing about beep baseball. We all joke about it today, recalling that it was definitely the blind leading the blind!
Beep baseball is an adapted sport for people who are visually impaired or blind and was created in the mid-sixties for the Colorado School for the Blind. Beep baseball is played with an oversized softball that BEEPS and two bases, first and third, that BUZZ. There are 6 players on a team and all wear blindfolds to equalize their vision. The pitcher, who is sighted, is on the same team as the batter. There is no throwing the ball and no running around all the bases. Simply explained, the batter hits the ball and runs to the base which buzzes and tags it. If he gets there before a fielder picks up the ball, he scores a run. If not, he is out. Three outs, six innings, game over!
Beep baseball has been around for decades and culminates every August in a World Series. Now, two and a half years after that first practice, seven Eclipse players have just returned from playing on other teams at the 2012 World Series in Ames, Iowa. The best description of going to the World Series is “It’s just like a big family reunion!”
Every year since 1976, teams have reunited to compete for the title of best beep baseball team in the world; this year, there were 199 players and over 150 volunteers. The majority of our core group of volunteers still commits to the team most Saturday mornings during the fall and spring. Why? Because they’re hooked on the benefits, one of which is camaraderie. Although playing beep baseball is definitely about getting exercise, learning a skill and having fun, it’s also about developing agility, becoming more independent and realizing, “If I can do this, I can do anything!”
After being together for over a year, the coed players, ranging in age from 18 to 65, decided to take charge of everything, dividing themselves into three committees: fundraising, socializing and travel. Leaders emerged to take charge of meals during games and travel arrangements when playing teams out of town. Besides seeking monetary support from groups such as the AT&T Pioneers, the players put their own effort into two fundraisers: selling Braves tickets and selling hot dogs at a local Kroger. The fruits of this labor went to Iowa expenses for some and will go to future World Series expenses for others. And the most exciting news about next year is that the World Series will be held in our own backyard: Columbus, Georgia! So if you would like to join us as a player or a volunteer and go to the 2013 World Series, we would love to have you! Our practices are at Coan Recreation Center and start in mid-September.
To see videos and learn more about how beep baseball is played, go to our Facebook Page - Atlanta Beep Baseball. You can also go to the website for the National Beep Baseball Association at www.nbba.org.
If you would like to help us keep this sport and worthwhile organization alive, contact me at 770-317-2035 or by e-mail at JudyByrd@gmail.com.
Opening up new career paths and educational opportunities for people with vision impairment, the system combines a number of pattern recognition technologies into a single platform and, for the first time, allows mathematics and graphical material to be extracted and described without sighted intervention.
Senior Lecturer Dr Iain Murray and PhD student Azadeh Nazemi of Curtin’s Department of Electrical and Computing Engineering developed the device to handle the extraordinary number of complex issues faced by the vision impaired when needing to read graphics, graphs, bills, bank statements and more.
To develop the device, the team has made use of a number of technologies, mostly based on pattern recognition, machine learning and various segmentation methods.
“Many of us take for granted the number of graphics and statistics we see in our daily lives, especially at work. We love to have graphics and diagrams to convey information, for example, look at how many statistics and graphs are used in the sports section of the newspaper,” Dr Murray said.
“People who are blind are often blocked from certain career paths and educational opportunities where graphs or graphics play a strong role. We hope this device will open up new opportunities for people with vision impairment – it’s a matter of providing more independence, and not having to rely on sighted
assistance to be able to read graphical and mathematical material.”
The device works by applying pattern recognition and other techniques to a document to identify images, graphs, maths and text. The system takes a document, such as a PDF, bill or scanned page, identifies blocks of text or pictures, segments these into related blocks and arranges the blocks in the correct reading order. Blocks are then identified as images, graphs, maths or text and recognised via optical character recognition or the maths utility Mathspeak. The result is then converted to audio format with navigation markup.
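The processing chain described above (segment, order, classify, recognise, then convert to audio with navigation markup) can be sketched roughly as follows. All function names and the toy classification rules here are invented for illustration; the actual Curtin system uses machine learning and pattern recognition rather than these stand-ins.

```python
# Hypothetical sketch of the reading pipeline described in the article.
# Names and rules are invented for illustration only.

def segment(document: str) -> list[str]:
    # Stand-in segmentation: split on blank lines into related blocks.
    return [b.strip() for b in document.split("\n\n") if b.strip()]

def classify(block: str) -> str:
    # Toy classifier standing in for the pattern-recognition step;
    # a real system would distinguish images, graphs, maths and text.
    if block.startswith("$"):
        return "maths"      # would be handed to a MathSpeak-style utility
    if block.startswith("[graph]"):
        return "graph"      # would be described via graph recognition
    return "text"           # would go through OCR and text-to-speech

def to_audio_script(document: str) -> list[tuple[str, str]]:
    # Arrange blocks in reading order and tag each with its recognised
    # type; the final step would synthesise speech with navigation markup.
    return [(classify(b), b) for b in segment(document)]

doc = "Quarterly revenue rose.\n\n[graph] revenue by quarter\n\n$x^2 + y$"
for kind, content in to_audio_script(doc):
    print(kind, "->", content)
```

The key design idea the article describes, segmenting first and classifying each block before recognition, is what lets one device handle text, bills, graphs and mathematics through a single audio interface.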
The device is 20cm long, 15cm wide and 3cm thick. The controls are very much like a cassette player with a couple of additions for navigating through headings or chapters. Books can be downloaded or posted out on USB storage devices. These books are in a specific format that allows audio playback with navigation
markup, with audio either in synthetic speech or human read. The device will have high contrast keys with tactile markings.
Dr Murray said the system runs on very inexpensive platforms, with an expected production cost as low as $100 per device, allowing it to be affordable to many people around the world and hopefully make a difference in third world countries.
He said there have previously been many methods to convert graphical material, but all are very labour-intensive and generally not easily transferable to other users.
“Our system is easily operated by people of all ages and abilities and it is open source, meaning anyone with the skill can use and modify the software to suit their application,” Dr Murray said.
The player has built-in user instructions and a speech engine that supports more than 120 different languages.
Dr Murray said he was now looking for philanthropic finance to set up production.
FCC Proposes Making TV Menus Accessible to Blind
By Julian Hattem
The Hill Newspaper, June 17, 2013
Federal regulators are unveiling draft rules to make cable and television menus accessible to the blind and visually impaired.
The proposal from the Federal Communications Commission (FCC) will require that users have an option to have onscreen menus read out loud, helping the blind understand what's playing on different channels. Cable and satellite menus, as well as other devices, will have to comply with the new rule once it is finalized.
"It's incredibly important for our ability to access entertainment content through the home theater experience," said Eric Bridges, the director of advocacy and governmental affairs with the American Council of the Blind.
The new proposal requires that 11 "essential functions" of TV be accessible, including adjusting volume, obtaining program and channel information, and various configuration settings.
The proposal is the FCC's last in a series of new rules to implement the 21st Century Communications and Video Accessibility Act (CVAA), a 2010 law. The agency had previously issued rules making cellphones' Internet browsers and emergency TV alerts accessible to blind people, among other measures.
The law sailed through Congress as an attempt to update technology for the modern era, making sure that people with disabilities could take advantage of cellphones and digital tools just like everyone else.
"Generally we've been really pleased with how the FCC has viewed the implementation of the CVAA, and if this is done correctly, this will mean equal access to entertainment content on television," Bridges said. The FCC is scheduled to publish the proposal in the Federal Register on Tuesday and is accepting comments for the next 50 days.
The agency has said it intends to finalize the rule by October.
2014 Eden Lion's Club Scholarship Recipients
Taylor Lea Chambers, $500; Amanda Christine Mericle, $1,000; Briana Janelle Millner, $500; Kendall Rose Tuttle, $500
Proposed Freedom Park Veteran's Memorial
All donations are tax deductible - for more information please contact Tina West 336-209-0808 or Matthew West 336-209-9722
The memorial will include a 4'x4'x8" laser-etched black granite monument: the front side in honor of all veterans, with a colored American flag and military seals; the back side in memory of those who didn't make it home, including MIA and POW. It will be placed on a 12' concrete pad with benches and landscaping.
********************************
How I became a Lion
I honestly did not know what a Lion was before I became blind. When I was a kid, or in my early teen years, there was a Lion in my town named Zizzy Osborne. I remember him because he would see me out to eat somewhere, come over, introduce himself and give me a peppermint candy. I believe he had a Lion pin on his shirt, and I began asking questions about who that nice guy was. Perhaps Lion Zizzy knew my Mom, and maybe Mom told me more about him, but I really don't remember. He must have been a Lion in the Eden Night Club, whereas I am in the Day Club. I will always remember Lion Zizzy, but I still did not know what a Lion does.
Over 10 years ago, around 1997 or 1998, I started being affected by a genetic disease. My vision was being affected, along with my memory, and I had leg pain. I was working a lot of hours, had just gotten married, and life was good, but as you know, men don't like going to a doctor. To make a long story short, I finally went to a Wal-Mart Vision Center in Martinsville, VA. The person who did the exam gave me a pair of simple glasses, not a prescription, saying he did not know what my condition was. My vision kept getting worse, along with my memory, and eventually I was referred to a neurologist in Greensboro. The neurologist sent me for an MRI the next week, where it was discovered that I had a rare genetic disease called adrenoleukodystrophy (the movie Lorenzo's Oil, from years back, was based on a true story about this disease). This disease was killing me. The doctor gave me a 20 percent chance to live.
So, I was admitted to Duke under the care of Dr. Joanne Kurtzberg. I had many MRIs throughout my treatment to monitor how the disease was affecting me. I had 10 days of high-dose chemo to wipe the disease out, then underwent the experimental “Stem Cell Transplantation”, using umbilical cord blood from a female baby, since the disease was fatal to males but not to females. My vision left me during the 10 days of chemo preparing for the transplant. The transplant of stem cells from the umbilical cord blood took, so at that point I was considered safe from the disease. The doctors, especially Dr. Kurtzberg, were happy and thought my vision might return after the chemo left my body, or at least that was what Dr. Kurtzberg wanted; she kept mentioning she wanted the Home Run, meaning my vision back and the other issues, such as leg-muscle pain and memory, fixed.

I was in Duke 97 days, beating the 100 days Dr. K told me to expect to be in the hospital... one of my goals accomplished! Dr. K also told me not to worry about eating solid food through the chemo and transplant, for they were feeding me through tubes and IVs, but I told her I would be eating through the chemo no problem. Dr. K sort of laughed and said that was the spirit to have, and that she loved her patients with the never-give-up attitude. I was also told success with anything depends on attitude (even Lions goals), so it felt good when I was released from Duke with a new outlook, having won against the disease that was killing me. Dr. K told me she was not giving up on my vision, but that I had too much life before the disease to give up now. She wanted me not to sell my house, my new truck, or anything, but to go to New York and get a guide dog, then go to the Raleigh blind school and learn to do things without sight. I asked Dr. K whether, if my sight came back, I would have to give up my dog; she laughed and said no.
I did the required mobility training with a cane, then applied to guide dog school in Smithtown, NY. I applied to Leader Dog and the others, but the Guide Dog Foundation accepted me at a time when I could go and return just before heading to the Governor Morehead School in Raleigh. I got my first guide dog, a small, beautiful and perfect Nelly. I made many mistakes and fussed at Nelly, thinking the errors were her guiding, but finally realized it was my own ignorance of being blind.
Now, onto how I became a Lion. I was finished with blind school and settled back in town. It was holiday time, and the doorbell rang. When I answered, a man introduced himself as George Johnson, saying he was a Lion and had brought me a gift. I invited him in, but he said he was busy delivering Christmas presents. I told him that when I went to Camp Dogwood, they had told us that if we ever got a chance to meet a Lion, to thank him or her for the Camp experience, so I thanked George and told him how much fun Camp was! George then asked whether I would like to come to a Lions Club meeting some time. I told him I would be happy to. George told me when the next meeting was and said he would pick me up.

I went to the meeting not knowing anything about a Lions meeting. A bunch of real nice people spoke to me, and they conducted a bunch of business I had no idea about, so I just listened. Then they called on the Tail Twister, and I had no idea what one of them critters was! The Tail Twister made a lot of noise walking around amongst cheers and boos, and that really got my interest up! That was about the end of the first meeting, and George asked whether I would come back, so I did. I think it was the third time that I was asked whether there was anything the Lions Club could do to assist me. I told them I didn't think so, but asked how I could help the Lions. I told George that if they wanted to sponsor a guide dog they could name it, or I could offer them a donation, or whatever. At that point George said he would be right back, and as he walked away he called an immediate Board meeting. I had no idea what a Board meeting was, but I sat there waiting, as George had been my ride. George returned pretty quickly and asked whether the Lions Club could sponsor my guide dog, paying for vet care or whatever was needed, and did I want to be a Lion? I happily accepted and was overjoyed to feel accepted!
I had questions, of course, such as about transportation and how things worked at the Club, but I was assured transportation would never be an issue and that I would be involved in everything the Club was.
At this point I not only finally knew what a Lion was, but I WAS a Lion! I have been a member ever since, and I like being involved in everything from fundraising to working events; I especially enjoy my position as second-year Zone Chair and being involved in decisions. I received a 10-year award a year or so back; I didn't even know I had been a Lion for 10 years until I was presented with the pin! I enjoy when people in my community or Lions Zone contact me asking for information, help or advice! Again, I feel useful! My life has changed so much since blindness, and it was nothing I expected: my marriage dissolved when my vision did not return as hoped, I lost friends and gained brand new ones, and my career as an electrician ended. But perhaps I have a new purpose in life, and that is to be as much help to my community and Lions Club as I can be, and to help others with visual impairments or blindness!
It feels good to be a Lion!
We Serve,
Zone 3 Chair Robbie.
Robbie Johnson with his guide dog Murphy and arson dog Phoenix, who stays with Stoneville Fire Marshal John Cruise.