
Saturday, November 6, 2010

First talking computer

(taken from the internet)
First Talking Computer
By Leonard A. McHugh
In February 1968, I began my career with the Pennsylvania Department of Highways, which later became the Pennsylvania Department of Transportation. This is where I first started working with Burroughs computers. By having the punched cards processed by a machine called an interpreter, I was able to read them. This machine reads the card punches and types the characters near the top. I would run this process two or three times, making the print very dark. I would also request all printed output on carbon-copy paper; the second copy was always dark. This allowed me to read the program listings as well as locate cards that needed corrections.

As time went by, technology improved while my sight simultaneously decreased. In the mid-1970s I acquired a full-size closed-circuit television system that had a magnifying lens. With this I could read listings, and some cards when necessary.

As upgrades to the computer were made, a computer terminal called a CONRAC replaced the Teletype unit used by the computer operators at the main console. One of Burroughs' field engineers discovered that he could greatly increase the height and width of the displayed characters. He could also change the contrast of the display to bright white letters on a totally black background. I was able to utilize this terminal for many years. Finally, in 1981, the decision was made to convert from Burroughs to IBM computers. The modified CONRAC was not compatible with the new IBM computers.

Unfortunately, there was not enough contrast on the new IBM terminals to allow me to see the text. I became fearful that I could no longer work in this environment.

It was around this time that Maryland Computer Systems was developing a talking computer. This unit was built from an HP computer and was called ITS, for Information Through Speech. But there was another problem: like the CONRAC, the ITS could not interface with an IBM computer.

I found a company in Boston called Industrial Computer Control. They were developing a converter that would make the HP computer function as an IBM terminal.

The converter cost $9,000 and the ITS was $6,000. Success: I was the first person to have a talking terminal on an IBM mainframe computer. (See picture above.)

When all the bugs were worked out, a press release was sent to the local papers. The story was also picked up by wire services across the country, as well as internationally.

I started receiving telephone calls from around the world. I had some long and interesting calls from England, Ireland and Canada, as well as many from around the country.

I ended up giving many demonstrations of this new technology to other employers and to caseworkers for blind clients.

Some of the phone calls were very interesting, although one particularly upset me. A woman called from Bell of Michigan. She informed me that she was reading a newspaper article about my talking computer. She went on to ask if I would like to move to Michigan and work for her. She then said, "We have many government contracts and must hire some disabled people. It would be nice to hire someone who can do something." This upset me so much that I found myself speechless. If I remember correctly, I slammed down the phone.

Another very interesting call was from a man asking some very detailed questions. He told me that his company was thinking of purchasing one hundred units. I asked him if they were hiring; it sounded like a great place to work. He told me, "No, they are not." He explained that they had a lot of people in a warehouse who could not read. He thought that these talking computers could solve their problem.

Saturday, September 25, 2010

White Cane Safety Day is coming

Ida Rieu teachers’ resource centre

Ida Rieu Teachers’ Resource Centre is organizing an awareness program for White Cane Safety Day, which is celebrated all over the world on 15th October each year.

The program is designed for anyone who is interested in helping the visually impaired persons of Pakistan.
The program will cover:
1. What the white cane is.
2. The importance of the white cane.
3. The role of society in improving the mobility of visually impaired persons.

Venue: Dr. Mohammad Hussain Panjwani School Complex for the Blind, on the premises of the Ida Rieu Welfare Organization.

Date: Thursday, 14 October 2010. Timings: 10 am sharp till 12 noon.

There is no registration fee.

Seats are limited, so kindly book your seat.

For confirmation, contact Shazia Hasan, Program Manager, Ida Rieu Teachers’ Resource Centre.

Email: s_hasan21@hotmail.com

Wednesday, July 7, 2010

visually impaired persons may be able to drive in the future

A car for blind drivers coming next year

The National Federation of the Blind and Virginia Tech say they hope to demonstrate a prototype equipped with technology that helps a sightless person get behind the wheel in 2011. The technology, called non-visual interfaces, will guide its driver through traffic by transmitting information about nearby vehicles or objects. Vibrating gloves or streams of compressed air directed towards the driver are among the options for communicating the information needed to avoid collisions and reach a destination.

Advocates for the blind describe the project as a “moon shot”, drawing parallels with President John F. Kennedy’s pledge to land a man on the moon. “We’re exploring areas that have previously been regarded as unexplorable,” said Marc Maurer, president of the National Federation of the Blind. “We’re moving away from the theory that blindness ends the capacity of human beings to make contributions to society.”

Here’s how the vehicle works: the steering wheel is hooked up to a distance monitor that gathers information from laser range finders. Voice software directs the driver every second on exactly how far to turn the steering wheel. For example, the monitor will tell the driver “turn left three clicks.” As the driver does that, the monitor makes three clicking noises. A vibrating vest provides cues to follow when accelerating and decelerating. The vest will vibrate in various spots (the back, abdominal area and the shoulders) to relay a variety of commands. When a driver needs to “Hit the brakes!” the entire vest will vibrate to a fare-thee-well!
(taken from the internet)

Friday, June 25, 2010

great news for visually impaired persons

(taken from the internet)
June 18 (Bloomberg) -- Patients blinded in one or both eyes by chemical burns regained their vision after healthy stem cells were extracted from their eyes and reimplanted, according to a report by Italian researchers at a scientific meeting.

The tissue was drawn from the limbus, an area at the junction of the cornea and the white part of the eye. It was grown on a fibrous tissue, then layered onto the damaged eyes. The cells grew into healthy corneal tissue, transforming disfigured, opaque eyes into functioning ones with normal appearance and color, said researchers led by Graziella Pellegrini of the University of Modena’s Center for Regenerative Medicine.

The stem-cell treatment restored sight to more than three-quarters of the 112 patients treated, Pellegrini said yesterday in a presentation at the International Society for Stem Cell Research meeting. The patients were followed for an average of three years, and some for as long as a decade, Pellegrini said.

“The patients, they are happy, even the partial successes,” she said in an interview at the meeting in San Francisco. “We have a couple of patients who were blind in both eyes. Can you imagine for these patients the change in their quality of life?”

The work was praised by Ivan Schwab, an ophthalmology professor and stem cell researcher at the University of California, Davis, who has treated patients in clinical trials with a procedure based on Pellegrini’s work. While his patients improved for a time, the benefits didn’t endure, he said in a June 15 telephone interview. Pellegrini’s patients appear to have long-term improvement, he said.

“The powerful part of her work is she has such long-term follow-up,” Schwab said.

Many of the patients she treated had been blind for years as a result of tissue and blood vessels growing over damaged parts of the eye. Some had been through failed surgeries and alternative treatments. Pellegrini estimated 1,000 to 2,000 patients in Europe suffer from burns with chemicals such as bleach or industrial solvents and may benefit from the procedure.

The key to success is to be certain that when the stem cells extracted from the limbus are grown in culture, they have the right mix of stem cells and the differentiated cells that form the corneal tissue, Pellegrini said. If there are too few stem cells in the transplant, the improvement won’t last, because there will be no reservoir to form the new corneal cells needed with the normal recycling of cells over time, she said.

The procedure succeeded after a single transplant in 69 percent of cases. A second procedure was performed on some patients, boosting the success rate to 77 percent, she said. The procedure was deemed a partial success in 13 percent of cases and a failure in 10 percent, she said.

Depending on the depth of the injury, some patients regained sight in as little as two months, Pellegrini said. Others with deeper injuries needed a second procedure and waited a year before sight was restored, she said.

The applications of the work may extend to other organs, Schwab said.

“This is bigger than just the surface of the eye,” he said. “She may be making a model for how to regenerate livers or other organs.”

Wednesday, April 14, 2010

Golfer or not!

The Amazing Story of courage and determination


And we complain about anything....

GOLFER OR NOT: WATCH THIS.

AMAZING VIDEO

Whether you're a golfer or not, this is something to see.
Near the end of the video, watch how this guy tees up his ball.

http://www.thegolfchannel.com/golf-videos/meet-butch-lumpkin-9477/?ref=26000

Tuesday, March 30, 2010

always be calm

ANGER MANAGEMENT


“Anger is the key that opens the door to all kinds of vices” – Imam Hassan Askari (a.s)
A man came out of his home to admire his new car. To his horror his little son was happily hammering dents into the shiny paint. The man ran to his son, knocked him away, and hammered the little boy’s hand into pulp as punishment. When the father calmed down, he rushed his son to the hospital. Upon taking a closer look, the man saw that his little boy had etched the words, “I LOVE YOU DAD” on the car.
Although the doctor tried desperately to save the crushed bones, he finally had to amputate the fingers from both the boy’s hands. When the boy woke up from the surgery & saw his bandaged stubs, he innocently said, “Dad, I’m sorry about your car.” Then he asked, “But when are my fingers going to grow back?”
Harmful effects of anger:
· Increases frustration.
· Prevents one from finding a solution to the problem.
· Makes one physically sick.
· Causes breaks in human relationships.
· Is responsible for some of the most depraved human behavior: child & wife abuse.
Anger Management:
· "And if an evil suggestion comes to you from shaytan, then seek refuge in Allah, He is hearing and knowing." (Qur'an 7:200) Therefore when one is angry he should immediately say “`ûdhû billâhi min ash-shaytân ir rajîm”

· Do wudu. The holy Prophet (s.a.w.w) has said, "Anger is from shaytan, and shaytan is from fire; fire is put out by water, so when angry, do wudu."
· Change body position. Our Prophet (peace and blessings be unto him) said, "If one of you gets angry while standing, he should sit. If he is still angry, he should lie down."
· Divert attention away from the cause of anger and participate in strenuous physical activity aiming at letting steam out and relaxing muscles
· Be silent, don't speak. The Prophet (peace and blessings be unto him) said, "Teach, simplify, don't complicate, and if you get angry, be silent."
· Try to pinpoint the exact reasons why you feel angry. Once you have identified the problem, consider coming up with different strategies for remedying the situation.
· Forgive & forget. Remember the rewards and virtues of patience, mercy, and forgiveness. The Qur'an 42:47 speaks of forgiveness, "And those who avoid major sins and immoralities and when angry they forgive."

Sunday, March 28, 2010

Hope is the key to success

(taken from the internet)
a Guy who got into Google



'God has always been planning things for me'
July 28, 2008
Shobha Warrier

Naga Naresh Karutura has just passed out of IIT Madras in Computer Science and has joined Google in Bangalore.

You may ask, what's so special about this 21-year-old when there are hundreds of students passing out from various IITs and joining big companies like Google?

Naresh is special. His parents are illiterate. He has no legs and moves around in his powered wheel chair. (In fact, when I could not locate his lab, he told me over the mobile phone, 'I will come and pick you up'. And in no time, he was there to guide me.)

Ever smiling, optimistic and full of spirit; that is Naresh. He says, "God has always been planning things for me. That is why I feel I am lucky."

Read why Naresh feels he is lucky.

Childhood in a village

I spent the first seven years of my life in Teeparru, a small village in Andhra Pradesh, on the banks of the river Godavari. My father Prasad was a lorry driver and my mother Kumari, a housewife. Though they were illiterate, my parents instilled in me and my elder sister (Sirisha) the importance of studying.

Looking back, one thing that surprises me now is the way my father taught me when I was in the 1st and 2nd standards. My father would ask me questions from the text book, and I would answer them. At that time, I didn't know he could not read or write but to make me happy, he helped me in my studies!

Another memory that doesn't go away is the floods in the village and how I was carried on top of a buffalo by my uncle. I also remember plucking fruits from a tree that was full of thorns.

I used to be very naughty, running around and playing all the time with my friends. I used to get a lot of scolding for disturbing the elders who slept in the afternoon. The moment they started scolding, I would run away to the fields!

I also remember finishing my school work fast in class and sleeping on the teacher's lap!

January 11, 1993, the fateful day

On January 11, 1993, when we had the Sankranti holidays, my mother took my sister and me to a nearby village for a family function. From there we were to go with our grandmother to our native place. But my grandmother did not come there. As there were no buses that day, my mother took a lift in my father's friend's lorry. As there were many people in the lorry, he made me sit next to him, close to the door.

It was my fault; I fiddled with the door latch and it opened wide throwing me out. As I fell, my legs got cut by the iron rods protruding from the lorry. Nothing happened to me except scratches on my legs.

The accident had happened just in front of a big private hospital but they refused to treat me saying it was an accident case. Then a police constable who was passing by took us to a government hospital.

First I underwent an operation as my small intestine got twisted. The doctors also bandaged my legs. I was there for a week. When the doctors found that gangrene had developed and it had reached up to my knees, they asked my father to take me to a district hospital. There, the doctors scolded my parents a lot for neglecting the wounds and allowing the gangrene to develop. But what could my ignorant parents do?

In no time, both my legs were amputated up to the hips.

I remember waking up and asking my mother, where are my legs? I also remember that my mother cried when I asked the question. I was in the hospital for three months.

Life without legs

I don't think my life changed dramatically after I lost both my legs. Because all at home were doting on me, I was enjoying all the attention rather than pitying myself. I was happy that I got a lot of fruits and biscuits.



'I never wallowed in self-pity'
July 28, 2008

The day I reached my village, my house was flooded with curious people; all of them wanted to know how a boy without legs looked. But I was not bothered; I was happy to see so many of them coming to see me, especially my friends!
All my friends saw to it that I was part of all the games they played; they carried me everywhere.

God's hand

I believe in God. I believe in destiny. I feel he plans everything for you. If not for the accident, we would not have moved from the village to Tanuku, a town. There I joined a missionary school, and my father built a house next to the school. Till the tenth standard, I studied in that school.

If I had continued in Teeparru, I might not have studied after the 10th. I might have started working as a farmer or something like that after my studies. I am sure God had other plans for me.

My sister, my friend

When the school was about to reopen, my parents moved from Teeparru to Tanuku and admitted both of us to the missionary school. They decided to put my sister in the same class as me, though she is two years older. They thought she could take care of me if both of us were in the same class. My sister never complained.

She would be there for everything. Many of my friends used to tell me, you are so lucky to have such a loving sister. There are many who do not care for their siblings.

She carried me in the school for a few years and after a while, my friends took over the task. When I got the tricycle, my sister used to push me around in the school.

My life, I would say, was normal, as everyone treated me like a normal kid. I never wallowed in self-pity. I was a happy boy and competed with others to be on top and the others also looked at me as a competitor.

Inspiration

I was inspired by two people when in school; my Maths teacher Pramod Lal who encouraged me to participate in various local talent tests, and a brilliant boy called Chowdhary, who was my senior.

When I came to know that he had joined Gowtham Junior College to prepare for IIT-JEE, it became my dream too. I stood first in my school in the 10th, scoring 542/600.

Because I topped in the state exams, Gowtham Junior College waived the fee for me. Pramod Sir's recommendation also helped. The fee was around Rs 50,000 per year, which my parents could never afford.

Moving to a residential school

Living in a residential school was a big change for me because till then my life centred around home and school and I had my parents and sister to take care of all my needs. It was the first time that I was interacting with society. It took one year for me to adjust to the new life.

There, my inspiration was a boy called K K S Bhaskar, who was in the top 10 in the IIT-JEE exams. He used to come to our school to encourage us. Though my parents didn't know anything about Gowtham Junior College or IIT, they always saw to it that I was encouraged in whatever I wanted to do. If the results were good, they would praise me to the skies, and if bad, they would try to see something good in that. They did not want me to feel bad.

They are such wonderful supportive parents.

Life at IIT Madras

Though my overall rank in the IIT-JEE was not that great (992), I was 4th in the physically handicapped category. So, I joined IIT Madras to study Computer Science.

Here, my role model was Karthik, who was also my senior in school. I looked up to him during my years at IIT Madras.

He had asked for attached bathrooms for those with special needs before I came here itself. So, when I came here, the room had attached bath. He used to help me and guide me a lot when I was here.

I evolved as a person in these four years, both academically and personally. It has been a great experience studying here. The people I was interacting with were so brilliant that I felt privileged to sit along with them in class. Just by speaking to my lab mates, I gained a lot.

'There are more good people in society than bad ones'

July 28, 2008

Words are inadequate to express my gratitude to Prof Pandurangan and all my lab mates; all were simply great. I was sent to Boston along with four others for our internship by Prof Pandurangan. It was a great experience.

Joining Google R&D

I did not want to pursue PhD as I wanted my parents to take rest now.

Morgan Stanley selected me first but I preferred Google because I wanted to work in pure computer science, algorithms and game theory.

I am lucky

Do you know why I say I am lucky?

I get help from total strangers without me asking for it. Once, after my second year at IIT, I was travelling in a train with some of my friends for a conference. We met a kind gentleman called Sundar in the train, and he has been taking care of my hostel fees from then on.

I have to mention the Jaipur foot. I had a Jaipur foot when I was in 3rd standard. After two years, I stopped using it. As I had almost no stumps on my legs, it was very tough to tie it to the body. I found walking with the Jaipur foot very, very slow. Sitting also was a problem. I found my tricycle faster, because I am one guy who wants to do things faster.

One great thing about the hospital is, they don't think their role ends by just fixing the Jaipur foot; they arrange for livelihood for all. They asked me what help I needed from them. I told them at that time, if I got into an IIT, I needed financial help from them. So, from the day I joined IIT, Madras, my fees were taken care of by them. So, my education at the IIT was never a burden on my parents and they could take care of my sister's Nursing studies.

Surprise awaited me at IIT

After my first year, when I went home, two things happened here at the Institute without my knowledge.

I got a letter from my department that they had arranged a lift and ramps at the department for me. It also said that if I came a bit early and checked whether it met with my requirements, it would be good.

The second surprise was that the Dean, Prof Idichandy, and the Students General Secretary, Prasad, had located a place that sold powered wheel chairs. The cost was Rs 55,000. What they did was, they did not buy the wheel chair; they gave me the money so that the wheel chair belonged to me and not the institute.

My life changed after that. I felt free and independent.

That's why I say I am lucky. God has planned things for me and takes care of me at every step.

The world is full of good people

I also feel if you are motivated and show some initiative, people around you will always help you. I also feel there are more good people in society than bad ones. I want all those who read this to feel that if Naresh can achieve something in life, you can too. (via email)

Wednesday, March 17, 2010

brain port

How BrainPort Works
by Julia Layton

Introduction to How BrainPort Works



The BrainPort technology manipulates the brain's sensory input and can allow the blind to see.


A blind woman sits in a chair holding a video camera focused on a scientist sitting in front of her. She has a device in her mouth, touching her tongue, and there are wires running from that device to the video camera. The woman has been blind since birth and doesn't really know what a rubber ball looks like, but the scientist is holding one. And when he suddenly rolls it in her direction, she puts out a hand to stop it. The blind woman saw the ball. Through her tongue.
Well, not exactly through her tongue, but the device in her mouth sent visual input through her tongue in much the same way that seeing individuals receive visual input through the eyes. In both cases, the initial sensory input mechanism -- the tongue or the eyes -- sends the visual data to the brain, where that data is processed and interpreted to form images.

What we're talking about here is electrotactile stimulation for sensory augmentation or substitution, an area of study that involves using encoded electric current to represent sensory information -- information that a person cannot receive through the traditional channel -- and applying that current to the skin, which sends the information to the brain. The brain then learns to interpret that sensory information as if it were being sent through the traditional channel for such data.

In the 1960s and '70s, this process was the subject of ground-breaking research in sensory substitution at the Smith-Kettlewell Institute led by Paul Bach-y-Rita, MD, Professor of Orthopedics and Rehabilitation and Biomedical Engineering at the University of Wisconsin, Madison. Now it's the basis for Wicab's BrainPort technology (Dr. Bach-y-Rita is also Chief Scientist and Chairman of the Board of Wicab).


Vibration
Electricity isn't the only type of stimulation used in high-tech sensory substitution devices. There are devices that use "vibrotactile" stimulation, among other means, to send information to the brain through an alternate sensory channel. In a vibrotactile stimulation device, encoded sensory signals are applied to the skin by one or more vibrating pins. Tactaid, an auditory substitution device, uses this type of technology.

Most of us are familiar with the augmentation or substitution of one sense for another. Eyeglasses are a typical example of sensory augmentation. Braille is a typical example of sensory substitution -- in this case, you're using one sense, touch, to take in information normally intended for another sense, vision. Electrotactile stimulation is a higher-tech method of receiving somewhat similar (although more surprising) results, and it's based on the idea that the brain can interpret sensory information even if it's not provided via the "natural" channel. Dr. Bach-y-Rita puts it this way:

... we do not see with the eyes; the optical image does not go beyond the retina where it is turned into spatio-temporal nerve patterns of [impulses] along the optic nerve fibers. The brain then recreates the images from analysis of the impulse patterns.
The multiple channels that carry sensory information to the brain, from the eyes, ears and skin, for instance, are set up in a similar manner to perform similar activities. All sensory information sent to the brain is carried by nerve fibers in the form of patterns of impulses, and the impulses end up in the different sensory centers of the brain for interpretation.

To substitute one sensory input channel for another, you need to correctly encode the nerve signals for the sensory event and send them to the brain through the alternate channel. The brain appears to be flexible when it comes to interpreting sensory input. You can train it to read input from, say, the tactile channel, as visual or balance information, and to act on it accordingly. In JS Online's "Device may be new pathway to the brain," University of Wisconsin biomedical engineer and BrainPort technology co-inventor Mitch Tyler states, "It's a great mystery as to how that process takes place, but the brain can do it if you give it the right information."

In the next section, we'll look more closely at the concepts of electrotactile stimulation.



Concepts of Electrotactile Stimulation

The concepts at work behind electrotactile stimulation for sensory substitution are complex, and the mechanics of implementation are no less so. The idea is to communicate non-tactile information via electrical stimulation of the sense of touch. In practice, this typically means that an array of electrodes receiving input from a non-tactile information source (a camera, for instance) applies small, controlled, painless currents (some subjects report it feeling something like soda bubbles) to the skin at precise locations according to an encoded pattern.

The encoding of the electrical pattern essentially attempts to mimic the input that would normally be received by the non-functioning sense. So patterns of light picked up by a camera to form an image, replacing the perception of the eyes, are converted into electrical pulses that represent those patterns of light. When the encoded pulses are applied to the skin, the skin is actually receiving image data.

According to Dr. Kurt Kaczmarek, BrainPort technology co-inventor and Senior Scientist with the University of Wisconsin Department of Orthopedics and Rehabilitation Medicine, what happens next is that "the electric field thus generated in subcutaneous tissue directly excites the afferent nerve fibers responsible for normal, mechanical touch sensations." Those nerve fibers forward their image-encoded touch signals to the tactile-sensory area of the cerebral cortex, the parietal lobe.
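As a rough illustration of that spatial encoding, here is a minimal Python sketch by way of example; the 10-by-10 grid matches the electrode array described later for the balance device, while the current scale and function name are invented for illustration and are not Wicab's.

```python
# A minimal sketch (not Wicab's firmware) of the spatial encoding idea:
# each cell of a small electrode array gets a current level derived from
# the average brightness of the corresponding block of camera pixels.

def encode_frame(pixels, grid=10, max_current_ua=800):
    """Map a square grayscale frame (list of rows, values 0-255) onto a
    grid x grid electrode array as per-electrode current levels."""
    size = len(pixels)
    block = size // grid
    currents = []
    for r in range(grid):
        row = []
        for c in range(grid):
            # Average the block of pixels that spatially corresponds
            # to this electrode, mimicking the camera/array mapping.
            total = 0
            for y in range(r * block, (r + 1) * block):
                for x in range(c * block, (c + 1) * block):
                    total += pixels[y][x]
            mean = total / (block * block)
            # Brighter region -> stronger (still painless) stimulation.
            row.append(int(mean / 255 * max_current_ua))
        currents.append(row)
    return currents

# Example: a synthetic 100x100 frame, downsampled to the 10x10 array.
frame = [[(x * y) % 256 for x in range(100)] for y in range(100)]
levels = encode_frame(frame)
print(levels[0][:5])
```

Each electrode's level is just the mean brightness of its patch of the frame, which is the spatial camera-to-array correspondence the early tactile-vision rigs used.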





Under normal circumstances, the parietal lobe receives touch information, the temporal lobe receives auditory information, the occipital lobe receives vision information and the cerebellum receives balance information. (The frontal lobe is responsible for all sorts of higher brain functions, and the brain stem connects the brain to the spinal cord.)



Within this system, arrays of electrodes can be used to communicate non-touch information through pathways to the brain normally used for touch-related impulses. It's a fairly popular area of study right now, and researchers are looking at endless ways to utilize the apparent willingness of the brain to adapt to cross-sensory input. Scientists are studying how to use electrotactile stimulation to provide sensory information to the vision impaired, the hearing impaired, the balance impaired and those who have lost the sense of touch in certain skin areas due to nerve damage. One particularly fascinating aspect of the research focuses on how to quantify certain sensory information in terms of electrical parameters -- in other words, how to convey "tactile red" using the characteristics of electricity.

This is a field of scientific study that has been around for nearly a century, but it has picked up steam in the last few decades. The miniaturization of electronics and increasingly powerful computers have made this type of system a marketable reality instead of just a really impressive laboratory demonstration. Enter BrainPort, a device that uses electrotactile stimulation to transmit non-tactile sensory information to the brain. BrainPort uses the tongue as a substitute sensory channel. In the next section, we'll get inside BrainPort.

BrainPort


Photo courtesy Wicab, Inc.
BrainPort balance device

Scientists have been studying electrotactile presentation of visual information since the early 1900s, at least. These research setups typically used a camera to set current levels for a matrix of electrodes that spatially corresponded to the camera's light sensors. The person touching the matrix could visually perceive the shape and spatial orientation of the object on which the camera was focused. BrainPort builds on this technology and is arguably more streamlined, controlled and sensitive than the systems that came before it.
For one thing, BrainPort uses the tongue instead of the fingertips, abdomen or back used by other systems. The tongue is more sensitive than other skin areas -- the nerve fibers are closer to the surface, there are more of them and there is no stratum corneum (an outer layer of dead skin cells) to act as an insulator. It requires less voltage to stimulate nerve fibers in the tongue -- 5 to 15 volts compared to 40 to 500 volts for areas like the fingertips or abdomen. Also, saliva contains electrolytes, free ions that act as electrical conductors, so it helps maintain the flow of current between the electrode and the skin tissue. And the area of the cerebral cortex that interprets touch data from the tongue is larger than the areas serving other body parts, so the tongue is a natural choice for conveying tactile-based data to the brain.

Wicab is currently seeking FDA approval for a balance-correction BrainPort application. A person whose vestibular system, the overall balance mechanism that begins in the inner ears, is damaged has little or no sense of balance -- in severe cases, he may have to grip the wall to make it down a hallway, or be unable to walk at all. Some inner-ear disorders include bilateral vestibular disorders (BVD), acoustic neuroma and Meniere's disease, and the sense of balance can also be affected by common conditions like migraines and strokes. The BrainPort balance device can help people with balance problems to retrain their brains to interpret balance information coming from their tongue instead of their inner ear.



Photo courtesy Wicab, Inc.
BrainPort balance components simplified


An accelerometer is a device that measures, among other things, tilt with respect to the pull of gravity. The accelerometer on the underside of the 10-by-10 electrode array transmits data about head position to the CPU through the communication circuitry. When the head tilts right, the CPU receives the "right" data and sends a signal telling the electrode array to provide current to the right side of the wearer's tongue. When the head tilts left, the device buzzes the left side of the tongue. When the head is level, BrainPort sends a pulse to the middle of the tongue. After multiple sessions with the device, the subject's brain starts to pick up on the signals as indicating head position -- balance information that normally comes from the inner ear -- instead of just tactile information.
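The left/right/center behavior just described reduces to a small decision rule. This toy Python sketch is a reader's illustration, not Wicab's code; the dead-zone threshold is an assumed parameter.

```python
# A toy sketch of the balance-device control loop described above:
# an accelerometer tilt reading selects which region of the tongue
# array is pulsed. The threshold value is an assumption.

def balance_cue(tilt_degrees, dead_zone=2.0):
    """Return which region of the electrode array to pulse for a given
    left/right head tilt (negative = left, positive = right)."""
    if tilt_degrees > dead_zone:
        return "right"   # head tilted right -> buzz right side of tongue
    if tilt_degrees < -dead_zone:
        return "left"    # head tilted left -> buzz left side
    return "center"      # head level -> pulse the middle of the tongue

# Example: a slow roll from left to right.
for tilt in (-8.0, -1.5, 0.0, 3.2, 9.7):
    print(tilt, "->", balance_cue(tilt))
```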

Wicab conducted a clinical trial with the balance device in 2005 with 28 subjects suffering from bilateral vestibular disorders (BVD). After training on BrainPort, all of the subjects regained their sense of balance for a period of time, sometimes up to six hours after each 20-minute BrainPort session. They could control their body movements and walk steadily in a variety of environments with a normal gait and with fine-motor control. They experienced muscle relaxation, emotional calm, improved vision and depth perception and normalized sleep patterns.

In the next section we'll look at the BrainPort vision device.



The BrainPort Vision Device

Test results for the BrainPort vision device are no less encouraging, although Wicab has not yet performed formal clinical trials with the setup. According to the University of Washington Department of Ophthalmology, 100 million people in the United States alone suffer from visual impairment. This might be age-related, including cataracts, glaucoma and macular degeneration, from diseases like trachoma, diabetes or HIV, or the result of eye trauma from an accident. BrainPort could provide vision-impaired people with limited forms of sight.



Photo courtesy Wicab, Inc.
Prototype BrainPort vision components simplified


To produce tactile vision, BrainPort uses a camera to capture visual data. The optical information -- light that would normally hit the retina -- that the camera picks up is in digital form, and it uses radio signals to send the ones and zeroes to the CPU for encoding. Each set of pixels in the camera's light sensor corresponds to an electrode in the array.

The CPU runs a program that turns the camera's electrical information into a spatially encoded signal. The encoded signal represents differences in pixel data as differences in pulse characteristics such as frequency, amplitude and duration. Multidimensional image information takes the form of variances in pulse current or voltage, pulse duration, intervals between pulses and the number of pulses in a burst, among other parameters. According to U.S. Patent 6,430,450, licensed to Wicab for the BrainPort application:

To the extent that a trained user may simultaneously distinguish between multiple of these characteristics of amplitude, width and frequency, the pulses may convey multidimensional information in much the same way that the eye perceives color from the independent stimulation of different color receptors.
The electrode array receives the resulting signal via the stimulation circuitry and applies it to the tongue. The brain eventually learns to interpret and use the information coming from the tongue as if it were coming from the eyes.
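To make the patent's idea concrete, here is a hedged Python sketch of encoding one pixel value into several pulse characteristics at once; all numeric ranges and names are invented for illustration and do not come from the patent or from Wicab.

```python
# A hedged sketch of multidimensional pulse encoding: one 0-255 pixel
# value is spread across several pulse characteristics, the way the
# patent text describes amplitude, width and frequency carrying
# independent information. All ranges below are illustrative only.

def pixel_to_pulse(value, edge=False):
    """Encode one 0-255 pixel as (frequency Hz, amplitude uA, width us,
    pulses per burst). An 'edge' pixel gets a longer burst to stand out."""
    freq = 10 + (value / 255) * 40      # brighter -> faster pulse train
    amp = 100 + (value / 255) * 400     # brighter -> stronger pulse
    width = 50 + (value / 255) * 150    # brighter -> wider pulse
    burst = 3 if edge else 1            # bursts can emphasize contours
    return freq, amp, width, burst

for v in (0, 128, 255):
    print(v, pixel_to_pulse(v))
```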
After training in laboratory tests, blind subjects were able to perceive visual traits like looming, depth, perspective, size and shape. The subjects could still feel the pulses on their tongue, but they could also perceive images generated from those pulses by their brain. The subjects perceived the objects as "out there" in front of them, separate from their own bodies. They could perceive and identify letters of the alphabet. In one case, when blind mountain climber Erik Weihenmayer was testing out the device, he was able to locate his wife in a forest.

One of the most common questions at this point is, "Are they really seeing?" That all depends on how you define vision. If seeing means you can identify the letter "T" somewhere outside yourself, sense when that "T" is getting larger, smaller, changing orientation or moving farther away from your own body, then they're really seeing.

One study that conducted PET brain scans of congenitally blind people while they were using the BrainPort vision device found that after several sessions with BrainPort, the vision centers of the subjects' brains lit up when visual information was sent to the brain through the tongue. If "seeing" means there's activity in the vision center of the cerebral cortex, then the blind subjects are really seeing.

The BrainPort test results are somewhat astonishing and lead many to wonder about the scope of applications for the technology. In the next section, we'll see which BrainPort applications Wicab is currently focusing on in clinical trials, what other applications it foresees for the technology and how close it is to commercially launching a consumer-friendly version of the device.

Current and Potential Applications


Photo courtesy Wicab, Inc.
BrainPort Balance Device

While the full spectrum of BrainPort applications has yet to be realized, the device has the potential to lessen an array of sensory limitations and to alleviate the symptoms of a variety of disorders. Just a few of the current or foreseeable medical applications include:
providing elements of sight for the visually impaired
providing sensory-motor training for stroke patients
providing tactile information for a part of the body with nerve damage
alleviating balance problems, posture-stability problems and muscle rigidity in people with balance disorders and Parkinson's disease
enhancing the integration and interpretation of sensory information in autistic people
Beyond medical applications, Wicab has been exploring potential military uses with a grant from the Defense Advanced Research Projects Agency (DARPA). The company is looking into underwater applications that could provide the Navy SEALs with navigation information and orientation signals in dark, murky water (this type of setup could ultimately find a major commercial market with recreational SCUBA divers). The BrainPort electrodes would receive input from a sonar device to provide not only directional cues but also a visual sense of obstacles and terrain. Military-navigation applications could extend to soldiers in the field when radio communication is dangerous or impossible or when their eyes, ears and hands are needed to manage other things -- things that might blow up. BrainPort may also provide expanded information for military pilots, such as a pulse on the tongue to indicate approaching aircraft or to indicate that they must take immediate action. With training, that pulse on their tongue could elicit a faster reaction time than a visual cue from a light on the dashboard, since the visual cue must be processed by the retina before it's forwarded to the brain for interpretation.
Other potential BrainPort applications include robotic surgery. The surgeon would wear electrotactile gloves to receive tactile input from robotic probes inside someone's chest cavity. In this way, the surgeon could feel what he's doing as he controls the robotic equipment. Race car drivers might use a version of BrainPort to train their brains for faster reaction times, and gamers might use electrotactile feedback gloves or controllers to feel what they're doing in a video game. A gaming BrainPort could also use a tactile-vision process to let gamers perceive additional information that isn't displayed on the screen.

Wicab is currently conducting a second round of clinical trials as it works its way through the FDA approval process for the balance device. The company estimates a commercial release in late 2006, with a roughly estimated selling price of $10,000 per unit.

Already more streamlined than any previous setup using electrotactile stimulation for sensory substitution, BrainPort is envisioned as becoming even smaller and less obtrusive in the future. In the case of the balance device, all of the electronics in the handheld part of the system might fit into a discreet mouthpiece. A dental-retainer-like unit would house a battery, the electrode array and all of the microelectronics necessary for signal encoding and transmitting. In the case of the BrainPort vision device, the electronics might be completely embedded in a pair of glasses along with a tiny camera and radio transmitter, and the mouthpiece would house a radio receiver to receive encoded signals from the glasses. It's not exactly a system on a chip, but give it 20 years -- we might be seeing a camera the size of a grain of rice embedded in people's foreheads by then.

For more information on BrainPort and related topics, consult the sources listed below.



Sources


Bach-y-Rita, Paul et al. "Form perception with a 49-point electrotactile stimulus array on the tongue: A technical note." Journal of Rehabilitation Research and Development, 1998.
http://kaz.med.wisc.edu/Publications/1998-BachyRita-JRRD-Tongue.pdf
Blakeslee, Sandra. "New Tools to Help Patients Reclaim Damaged Senses." New York Times, Nov. 23, 2004.
http://www.goupstate.com/apps/pbcs.dll/article?AID=/20041123/ZNYT05/411230391/1051/NEWS01
Kaczmarek, Kurt, Ph.D. "Tongue Display Technology." University of Wisconsin, Aug. 18, 2005.
http://kaz.med.wisc.edu/Publicity/Synopsis.html
Kupers, Ron et al. "Activation of visual cortex by electrotactile stimulation of the tongue in early-blind subjects." Human Brain Mapping, 2003.
http://208.164.121.55/hbm2003/abstract/abstract1557.htm
Manning, Joe. "Device may be new pathway to the brain." JS Online, Dec. 7, 2004.
http://www.jsonline.com/story/index.aspx?id=282145
Phone interview with Kurt Kaczmarek, Ph.D., Senior Scientist, University of Wisconsin Department of Orthopedics and Rehabilitation Medicine. July 7, 2006.
Ptito, Maurice et al. "Cross-modal plasticity revealed by electrotactile stimulation of the tongue in the congenitally blind." Brain, 2005.
http://brain.oxfordjournals.org/cgi/content/abstract/128/3/606
U.S. Patent #6,430,450. "Tongue placed tactile output device."
Wicab, Inc.
http://www.wicab.com/
"Wicab to present BrainPort at Boston conference." WTN News. Oct. 4, 2005.
http://wistechnology.com/printarticle.php?id=2319

Wednesday, March 10, 2010

proud to be a teacher!

WHAT DO TEACHERS MAKE....?

The dinner guests were sitting around the table discussing life.

One man, a CEO, decided to explain the problem with education. He argued, "What's a kid going to learn from someone who decided his best option in life was to become a teacher?"

To stress his point he said to another guest: "You're a teacher, Barbara. Be honest. What do you make?"

Barbara, who had a reputation for honesty and frankness replied, "You want to know what I make? (She paused for a second, and then began...)

"Well, I make kids work harder than they ever thought they could.

I make a C+ feel like the Congressional Medal of Honor.

I make kids sit through 40 minutes of class time when their parents CAN'T make them sit for 5 without an iPod, GameCube or movie rental.

You want to know what I make? (She paused again and looked at each and every person at the table)

I make kids wonder.

I make them question.

I make them apologize and mean it.

I make them have respect and take responsibility for their actions.

I teach them to write and then I make them write. Keyboarding ISN'T EVERYTHING.

I make them read, read, read.

I make them show all their work in maths. They use their God-given brains, not a man-made calculator.

I make my students from other countries learn everything they need to know about English while preserving their unique cultural identity.

I make my classroom a place where all my students feel safe.

Finally, I make them understand that if they use the gifts they were given, work hard, and follow their hearts, they can succeed in life (Barbara paused one last time and then continued.)

Then, when people try to judge me by what I make, knowing that money isn't everything, I can hold my head up high and pay no attention, because they are ignorant.

You want to know what I make?

I MAKE A DIFFERENCE.

What do you make, Mr. CEO?

His jaw dropped; he went silent.




THIS IS WORTH SENDING TO EVERY TEACHER, EVERY CEO, AND EVERY PERSON YOU KNOW.

Even all your personal teachers like mothers, fathers, brothers, sisters, coaches, and others.

A profound answer!!!

Friday, February 26, 2010

this is life

This is a YouTube clip of a person who has no legs or arms and lives a full life.



Subject: i love living life and i m happy (inspirational)


http://www.youtube.com/watch?v=H8ZuKF3dxCY&feature=related

Tuesday, February 2, 2010

oratio

(taken from the internet)
Oratio for BlackBerry® is now available

Longueuil, QC, Canada and Barcelona, Spain, February 1st, 2010 – HumanWare and Code Factory are pleased to announce that Oratio for BlackBerry(R) smartphones is now available for purchase. Formerly known as Orator for BlackBerry smartphones, Oratio is the first screen reader software solution that enables visually impaired users to access and operate BlackBerry smartphones, using state-of-the-art text-to-speech technology to convert the visual information displayed on the BlackBerry smartphone screen into intuitive speech output. This enables its users to use BlackBerry smartphones to increase their independence and productivity in today’s competitive world.

The name was changed from Orator to Oratio to avoid any confusion with an existing product called Orator manufactured by a telecommunications company in the USA. “Although we got accustomed to the name Orator for BlackBerry in the last few months, Oratio is less generic and provides a more personalized name and sound for the product,” says Michel Pepin, Product Manager at HumanWare.

Availability: Oratio will first be released in North America in English, supporting the BlackBerry Curve 8520 smartphone from AT&T, available through online purchasing from www.oratio4bb.com for $449 US for a single license. Support for additional BlackBerry smartphone models and languages will be available in subsequent versions of Oratio.

Oratio is the product of the joint collaborative efforts of HumanWare; Code Factory, the leading provider of screen reader technology and maker of Mobile Speak; and Research In Motion (RIM), the maker of the award-winning portfolio of BlackBerry products and solutions. Oratio users will experience more freedom and independence in their activities, with the ability to stay connected anytime, anywhere. Users will also experience greater flexibility to manage their day-to-day activities in ways that are most convenient for them, increase their productivity, and achieve more by quickly and efficiently accessing the information they need.

Oratio also provides employers with an accommodation solution for blind and visually impaired employees that leverages an organization's existing investment in BlackBerry infrastructure and technologies.

Feature rich, through its easy-to-use menu and efficient shortcut keys, Oratio will provide users with:
· An intuitive and familiar audio user interface.
· Easy-to-use customization options for frequently used settings.
· An auto-start mode when the device turns on.
· Different verbosity levels to allow users to define the amount of information provided.
· Keyboard echo settings for text entry.
· An easy-to-use command structure.
· Support for the BlackBerry smartphone's core applications.

BlackBerry smartphones offer multiple applications essential in a business environment. Oratio was designed to support the core applications found on BlackBerry smartphones, allowing visually impaired users to:
· Manage instant messaging, emails, SMS and MMS.
· Make and receive calls, with access to caller ID on incoming calls.
· Manage the contact list and call log.
· Schedule appointments and tasks with alarms and reminders.
· Access the phone's settings, ring tones, speed dials and voice tags.

“Oratio is the first screen reader solution for a JavaME operating software (O/S). While this first release version may not answer each specific individual user's needs, HumanWare, with the joint collaboration of RIM and Code Factory, remains dedicated and committed to the future development and growth of the product. We invite Oratio users to share their experiences with the product. This will provide us with directions on how to improve their BlackBerry smartphone experience,” says Michel Pepin. “Our goal is to provide equal access to visually impaired users by enabling them to access and operate BlackBerry devices in a manner that is functionally equivalent to solutions offered to sighted BlackBerry users.”

Wednesday, January 27, 2010

new technology

(taken from the internet)

During the past ten years, evolutions in many fields of technology have influenced the lives of all of us, and especially the world's blind population. Advancements in speech synthesis have led to the usability of many different operating systems, Linux among them. One of these programs, and by far one of the best, is a screen review package called Speakup, written by Kirk Reiser with assistance from the user community. Speakup is unique in the sense that it integrates seamlessly into the kernel, allowing it to talk from startup to shutdown, and even to debug kernel errors, which I can testify to from personal experience. It also makes the installation of a Linux system much easier, because one does not usually require a serial console or sighted assistance to complete the installation process.

A screen review package is a program that takes the text displayed on the screen, and outputs it in spoken words. The actual speaking is done by a speech synthesizer, which can come in either hardware or software versions. Hardware synthesizers are either external boxes with headphone jacks and volume knobs that plug in to your computer via serial or USB ports, or ISA or PCI cards that have an output jack for a speaker or headphones. Software synthesizers are actual software programs that handle all the processing of the text into spoken words and output it through the computer's sound card. Speakup supports both hardware and software synthesizers, though software synthesizers require a user-space program and thus can't load at kernel boot, as we'll discuss later. Speakup's key features include seamless integration, logical key layout, support for laptop keyboards, easy adjustability of speech settings and support for software synthesizers.

Features
Speakup is packed full of features, some of which you won't find in any other screen reader. In order to read text, Speakup uses an invisible review cursor. At the same time, however, Speakup tracks the system cursor, to facilitate navigation in menus, editors and similar situations. To perform tasks such as moving the review cursor around, Speakup uses the numeric keypad, hereafter referred to as the numpad.

The numpad Enter key silences speech until the next key press, which is very useful for quieting boot-up messages and/or frequently heard text. It also synchronizes the location of the review cursor with the system cursor, facilitating many different operations. Insert plus numpad Enter silences reading of new text until this combination is pressed again, but still allows you to move around the screen.

The numpad plus key reads the entire screen. The numpad 0, or insert, is used as a key modifier similar to Alt, Ctrl or Shift. Speakup also respects numlock, still allowing the user to enter numbers from the numpad if necessary. Numpad keys 7–9 go up a line, read the current line and go down a line, respectively. A similar arrangement is used for words on numpad 4–6, and with characters on numpad 1–3. The numpad slash marks a spot on the screen, and if there is a spot already marked, it copies the text into memory. Insert plus numpad slash inputs any previously copied text, which usually results in pasting it to the location of the system cursor.

The numpad minus parks the review cursor. Parking means that the review cursor's location will not be moved unless the user moves it; this is useful for tracking text that changes but is not at the cursor, requiring you to move to it constantly. This functionality is also in the windowing system, which will be covered shortly. Numpad star toggles on and off cursor tracking. This is different from parking the review cursor, because parking does not affect what is actually spoken, just where the review cursor is. Cursor tracking always speaks what is at the cursor, which is optimum for menus and editors, but occasionally you may need to turn it off.
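Gathered in one place, the numpad commands described above amount to a small keymap. The following Python dictionary is just a reader's summary of the assignments listed in this section, not Speakup's internal key table:

```python
# Speakup's numpad review commands, as described above, collected into
# a simple mapping. "insert+" means the numpad 0/Insert modifier is held.

NUMPAD_KEYS = {
    "enter":        "silence speech until next key; sync review cursor",
    "insert+enter": "toggle silencing of newly arriving text",
    "plus":         "read the entire screen",
    "7": "previous line", "8": "current line", "9": "next line",
    "4": "previous word", "5": "current word", "6": "next word",
    "1": "previous char", "2": "current char", "3": "next char",
    "slash":        "mark a spot; if already marked, copy the text",
    "insert+slash": "paste previously copied text at the system cursor",
    "minus":        "park the review cursor",
    "star":         "toggle cursor tracking on and off",
}

for key, action in NUMPAD_KEYS.items():
    print(f"numpad {key}: {action}")
```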

Laptops
For laptops, Speakup has a set of key assignments as well. These center around the Caps Lock key, or the Windows logo key if it is present on the keyboard. While the Caps Lock key is down, the letters I, O and U act as the numpad 7–9. Thus, you have a very similar arrangement to what you have on the numpad. Some things are different—for instance, Caps Lock plus Enter acts as numpad Enter, but overall it's very similar and easy to learn. Whichever is used, the Caps Lock/Windows key or the numpad Insert key is referred to as the Speakup key.

Adjusting Settings
Adjusting speech settings, such as volume, rate, pitch and tone, can be done in two ways.

The first, and probably the easiest, is to use the Speakup key plus the numbers on the number row. The Speakup key plus 1 and 2 decrement and increment the volume, respectively; 3 and 4 do the same with pitch; and finally 5 and 6 do the same with rate. The Speakup key plus F9 and F10 control punctuation, and the Speakup key plus F11 and F12 control the punctuation only for reading.

The Speakup key plus F5 lets you edit the “some” punctuation level. It works by toggling the punctuation that you press, as to whether it is spoken in the specified level. The Speakup key plus F6 does the same for the “most” punctuation level, and Speakup key plus F7 lets you edit what delimiters are used when moving by words; usually it is spacing and certain punctuation.

The other method of changing speech settings is to use the Speakup entry under /proc. Under /proc/speakup, there are the usual items, such as volume, rate, pitch, voice, version and synth_name, as well as some more-advanced items dealing with timing and other things. Some of these values are read/write, and some are read-only. For instance, version gives the current revision of Speakup, including the CVS build date if applicable, but synth_name can be used both to get and set the synthesizer in use. synth_direct is a write-only entry that sends all text directly to the synthesizer. It is even possible to load a new keymap while the system is running, rather than having to rebuild the kernel. There are also values for punct_some, punct_most and delimiters, which do the same things as the key functions described above. There is also a script called speakupconfig, which saves all of your entries in /proc/speakup for the particular synthesizer in use and allows you to restore these settings later, allowing automated loading of settings.
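Since the /proc entries are ordinary files, they can be driven from any language. Below is a minimal Python sketch, assuming a kernel with Speakup built in and root privileges for writes; the entry names (version, synth_name, rate, synth_direct) are the ones listed above, but the helper functions are the author's own illustration.

```python
# A minimal sketch of driving Speakup's /proc interface. Entry names
# come from the article; writes require root, and synth_direct is
# write-only (text written there is spoken directly).

PROC = "/proc/speakup"

def read_setting(name):
    """Read one /proc/speakup entry, e.g. 'version' or 'synth_name'."""
    with open(f"{PROC}/{name}") as f:
        return f.read().strip()

def write_setting(name, value):
    """Write one /proc/speakup entry, e.g. set 'rate' or 'synth_name'."""
    with open(f"{PROC}/{name}", "w") as f:
        f.write(str(value))

if __name__ == "__main__":
    print("Speakup version:", read_setting("version"))
    print("Current synth:", read_setting("synth_name"))
    write_setting("rate", 7)                            # speed up speech
    write_setting("synth_direct", "hello from proc\n")  # speak this text
```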

Windows
Speakup has a windowing system, which can be very useful in certain programs where a specific area of the screen that is not tracked by the cursor is updated frequently. The Speakup key plus F2 is used to set the window dimensions; the Speakup key plus F3 clears the window settings, allowing you to set a new one; and the Speakup key plus F4 silences the window, preventing it from being read automatically. However, you can read windows manually with the Speakup key plus the numpad plus key.

Work is now being done on color and highlighting recognition, which will allow ncurses-based programs to function even better than they do now, especially in menus. This means that text in a different color from the surrounding text will be given a higher priority and thus be read first.

Help
There are several ways to get help on Speakup. First, you can load the module called speakup_keyhelp and press the Speakup key plus F1. This puts you in a key-identification mode, which you can exit by pressing the spacebar. While in this mode, Speakup speaks the description of any key assigned to a Speakup function and lets you arrow through the list of assignments. Another way to get help is to consult the guide provided with Speakup under Documentation in the kernel tree, or on the Web site. This document contains many useful instructions that can get a new user started with Speakup, as well as refresh an existing user's memory.
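
If key help was built as a module rather than into the kernel, loading it is a one-liner; this sketch assumes a modular build and the module name given above.

    # Load the key-help module, then press the Speakup key plus F1.
    modprobe speakup_keyhelp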

Installation
The number one thing that sets Speakup apart from other screen reader programs is the fact that it is literally part of the kernel. The install script applies a few patches to some kernel source files and copies the relevant Speakup sources to drivers/char in the kernel tree. Then, when make config is executed, there is a section for console speech output and Speakup. There you can choose which synthesizers you would like to build directly into the kernel or as modules, though software speech support can be built only as a module.

You can also select which synthesizer you want to be the default at startup. Thus, if you build everything into the kernel, you have a fully talking Linux system from startup to shutdown. This allows a blind person to install Linux without any sighted assistance whatsoever, because every step of the installation talks.
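
In practice the build flow looks roughly like the following sketch; the kernel source path is an assumption, and the exact make targets depend on your kernel version.

    # From a kernel tree that already has the Speakup patches applied:
    cd /usr/src/linux
    make config            # answer the console speech output questions here
    make && make modules_install && make install
    # Reboot; if your synthesizer driver is built in and set as the
    # default, the system talks from the moment the kernel is up.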

There are Speakup-modified ISO images for three major distros: Debian, Fedora and Slackware. Slackware has actually incorporated Speakup into its official installation setup, simplifying things even further. There is also a Speakup-enabled version of Knoppix, which is a basic Linux distro on CD. This allows people wanting a quick look at a Linux system simply to boot the CD, have it come up talking and not have to worry about installation unless they're interested. It also can be very useful for crash recovery.

Software Speech
As previously mentioned, Speakup supports software speech synthesizers with some user-space support. Some of the better-known software synthesizers include Festival, Flite, FreeTTS and IBM's ViaVoice Outloud, which is no longer supported. Software speech in Speakup centers around another program called Speech Dispatcher. Speech Dispatcher is a framework that provides a single interface to multiple software synthesizers. It does this through a series of programs that provide a Speech Dispatcher interface to applications such as Emacs, as well as libraries for a number of languages. It also has a TCP protocol for transmitting speech from a server to a client that does the actual output.
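
A quick way to confirm that Speech Dispatcher itself is working is its small command-line client, spd-say, assuming your installed version ships it:

    # Ask Speech Dispatcher to speak a test phrase.
    spd-say "Speech Dispatcher is alive"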

Speakup has a generic software synthesizer driver, exposed as /dev/softsynth, which outputs the text that would normally be sent to a hardware synthesizer. A module for Speech Dispatcher, called speechd-up, takes the text from /dev/softsynth and sends it to Speech Dispatcher and a software synthesizer of the user's choice. Support exists for Festival, Flite, DECtalk Software and generic synthesizers. You also can integrate other synthesizers with some tweaking of configuration files. Performance-wise, software synthesizers have a slight lag in responsiveness compared to hardware synthesizers, but the overall result is not bad given the circumstances.

The first step is to get Speech Dispatcher working, which is not hard at all; just compile it and you're set. You then edit its configuration file to tell it which synthesizer to use; by default it uses Flite. Next, compile and install speechd-up. To start software speech, load the speakup_sftsyn module if you haven't already, and run speechd-up. If you do this through an init script, you will still get an early-talking system, though not one entirely in the kernel.
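
Put together, the whole setup might look like this sketch; the source directory names and the configuration path are assumptions for the example, so adjust them to wherever your copies actually live.

    # Build and install Speech Dispatcher, then speechd-up.
    (cd speechd && ./configure && make && make install)
    (cd speechd-up && ./configure && make && make install)

    # To use a synthesizer other than the default Flite, edit Speech
    # Dispatcher's configuration file (path is an assumption):
    #   /usr/local/etc/speech-dispatcher/speechd.conf

    # Start software speech.
    modprobe speakup_sftsyn    # Speakup's software-synth driver
    speechd-up                 # bridges /dev/softsynth to Speech Dispatcher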

Future
Many things are planned for Speakup in the future. As has been previously mentioned, work has been started on color recognition and highlight tracking, thanks to some folks at the American Printing House for the Blind. This will enable many menu-based programs to talk much more smoothly.

Another planned feature is keyboard macros, which will let the user accomplish many different tasks with the press of one key. We eventually want to have a screen-memory find function, as well as a goto function for jumping to a specific set of coordinates on the screen.

Configuration files are also under consideration. These files would somehow have to be loaded when their corresponding program is executed, and would contain the voice, macro and other information needed to work with that program.

All of these features and more are planned for Speakup's future, provided people are willing to contribute their time to the effort of making Linux accessible to the world's blind population.

Conclusion
Today, technology has revolutionized the lives of the world's blind population. Computers let us access data more easily than ever, and the Internet's arrival in the mainstream has made communicating and connecting with others easier for everyone. Linux systems are economical by nature, not requiring the absolute latest hardware to run well. That is especially helpful for blind users, who may not have access to as much funding as would be ideal. Now there is a cheap and workable solution for those people: a fully talking Linux system with Speakup. And with the introduction of software speech and Speech Dispatcher, it just got even cheaper.

Resources for this article: /article/8586.

Ameer Armaly is a sixteen-year-old junior in high school. He has been blind since birth, and enjoys programming, food and science fiction. He uses computers with the aid of talking programs that read the text aloud.

Saturday, January 23, 2010

story of courage

(taken from the internet)
Armless Girl Gets A Pilot License - Bravo

“WHERE THERE IS A WILL, THERE IS A WAY”

Jessica Cox, 25, who was born without arms, stands inside an aircraft. The Tucson, Arizona native recently earned her Sport Pilot certificate, becoming the first pilot licensed to fly using only her feet.

Jessica Cox of Tucson was born without arms, but that has only stopped her from doing one thing: using the word "can't."

Her latest flight into the seemingly impossible is becoming the first pilot licensed to fly using only her feet.

With one foot manning the controls and the other delicately guiding the steering column, Cox, 25, soared to achieve a Sport Pilot certificate. Her certificate qualifies her to fly a light-sport aircraft to altitudes of 10,000 feet.

"She's a good pilot. She's rock solid," said Parrish Traweek, 42, the flying instructor at San Manuel's Ray Blair Airport.

Parrish Traweek runs PC Aircraft Maintenance and Flight Services and has trained many pilots, some of whom didn't come close to Cox's abilities.

"When she came up here driving a car," Traweek recalled, "I knew she'd have no problem flying a plane."

Doctors never learned why she was born without arms, but she figured out early on that she didn't want to use prosthetic devices.

In Happy moments, praise God.
In Difficult moments, seek God.
In Quiet moments, worship God.
In Painful moments, trust God.
In Every moment, thank God.