Friday, December 18, 2015

Robotic 'glove' will allow the blind to see with their hands


A team of researchers from the University of Nevada, Reno and the University of Arkansas at Little Rock announced on Thursday that they are developing a robotic wearable that will enable the visually impaired to "see" what they're reaching for.
"We will pre-map the hand, and build a lightweight form-fitting device that attaches to the hand using key locations for cameras and mechanical and electrical sensors," Yantao Shen, assistant professor and lead researcher on the project from the University of Nevada, Reno's College of Engineering, said in a statement. "It will be simpler than a glove, and less obtrusive." The device will combine tactile and temperature sensors, high resolution cameras and miniaturized microphones to sense items around it. This information -- the item's location, size and shape -- is then relayed to the user via haptic and audio feedback.
"Not only will this device help blind and visually impaired people, the methods and technology we develop will have great potential in advancing small and wearable robot autonomy," Shen added. The technology is still in development and there is no word on when it will actually be made available to the public.

Monday, November 16, 2015

Breaking a different kind of language barrier: Sign language becomes sensor-based



American Sign Language is the bridge that connects deaf and hard-of-hearing people, in large part, to the world of traditional interpersonal communications. But how does one communicate in ASL when a conversation partner cannot interpret the visually based language?
Seeking to close that kind of communications gap, work is underway at Texas A&M University. Roozbeh Jafari, associate professor and principal investigator with the school’s Department of Biomedical Engineering — and researcher at its Center for Remote Health Technologies and Systems — is developing a newly sophisticated tool to make ASL understandable to everyone.
The results of Jafari’s project, and the long-term implications that stem from it, could change the way we approach interfacing with each other — and even with technology — all based on our hands, muscles and movements.

Vision Quest: Recent Challenges for ASL Translation

The ASL translation system doesn’t have an official name yet, but what it’s doing — and what it stands to do — is concrete and apparent. The goal is to translate ASL for all participants in a way that proves more accurate, more portable and more reliable than ever before.
“There have been a few systems for translating American Sign Language automatically,” said Jafari, regarding devices that precede the new technology he is working to refine. “The most prominent among them have been based on cameras and vision … you would basically stand in front of a camera and the camera would track hand motion.”

It is a system for turning visually tracked movement into words. But the cameras that facilitate it only work well when the ASL gestures being tracked are precise enough for the computer on the other end to recognise. When signs miss that mark, the conversation between an ASL user and a non-ASL-using participant becomes difficult: words get lost, and communications break down. Add in the challenge of where a camera can be placed in a room full of ASL-using participants, and the fact that users would have to carry a motion-tracking camera everywhere.
In all of these factors, Jafari saw the need for a different ASL-interpreting tool.

Beyond Vision: Jafari’s Motion- and Muscle-Tracking Approach to ASL Translation

In Jafari’s project, the camera is out of the picture. Instead, his technology applies an external motion sensor and a wearable muscle-tracking sensor to create a new version of ASL translation.
“The sensor is based on EMG, or electromyogram technology,” Jafari said, referring to sensors the Mayo Clinic describes as measuring electrical signals — ones that our motor neurons transmit to muscles, causing them to contract. EMGs can turn these signals into numerical values that computers and specialists are able to interpret.
“Combined with the external motion sensors, which show us the overall hand movement, the EMG allows us to discriminate between gestures,” he said. “A fine-grain of interpretation … motion sensors give us the overall sense and muscle activities give us information about the fine-grained intent.”
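The team's actual pipeline isn't published here, but the fusion Jafari describes (coarse motion from external sensors plus fine-grained intent from EMG) can be sketched roughly as feature concatenation followed by a simple classifier. Everything below, from the feature choices to the nearest-centroid classifier, is an illustrative assumption, not the Texas A&M implementation.

```python
# Hypothetical sketch of EMG + motion sensor fusion for gesture recognition.
import numpy as np

def features(imu: np.ndarray, emg: np.ndarray) -> np.ndarray:
    """imu: (t, 3) accelerometer samples; emg: (t, channels) EMG samples."""
    motion = imu.mean(axis=0)                  # overall hand movement
    muscle = np.sqrt((emg ** 2).mean(axis=0))  # RMS muscle activation per channel
    return np.concatenate([motion, muscle])

def classify(x: np.ndarray, centroids: dict[str, np.ndarray]) -> str:
    """Nearest-centroid over per-gesture averages of training features."""
    return min(centroids, key=lambda g: np.linalg.norm(x - centroids[g]))

# Usage: gesture = classify(features(imu_window, emg_window), centroids)
```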

Next Steps: Focusing on the Details of New ASL Tech

The team at Texas A&M has produced an operational proof-of-concept model of the ASL-interpreting technology. The next step is to refine the sensitivity and accuracy of the devices.
  • Currently, every wearer of the EMG sensor must position the device precisely each time they don it; otherwise, the system must be “retrained” to register the user's ASL vocabulary. Jafari said they’re working on ways to “make the system smarter, in a sense … to reduce or eliminate training time.”
  • At present, Jafari’s system recognises individual words but requires a pause between them. As the team develops the work further, the goal is for the translation engine to combine the input it receives into whole phrases and sentences — more akin to the way humans naturally communicate (a minimal sketch of this idea follows the list).
  • The third prong of development is to increase the vocabulary of the technology, overall.
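As a rough illustration of the second goal above, merging pause-separated words into sentences, here is a hypothetical sketch; the timing threshold and data format are invented:

```python
# Buffer word-level recognitions and emit a sentence when the pause
# between words exceeds a threshold. Timing values are made up.

def assemble_sentences(timed_words, end_pause_s=1.5):
    """timed_words: iterable of (word, start_time_s) pairs in order."""
    sentence, last_t = [], None
    for word, t in timed_words:
        if last_t is not None and t - last_t > end_pause_s and sentence:
            yield " ".join(sentence)
            sentence = []
        sentence.append(word)
        last_t = t
    if sentence:
        yield " ".join(sentence)

stream = [("I", 0.0), ("need", 0.6), ("help", 1.2), ("thank", 4.0), ("you", 4.5)]
print(list(assemble_sentences(stream)))  # ['I need help', 'thank you']
```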
When all of Jafari’s developing tech is operating at the advanced level he describes, ASL users and their conversation partners will clearly benefit. But the applications of the sensor-based system extend beyond sign language and translation alone.
“When you think about it, you can use this for many other applications,” he said. “Think about your house … you might have a smart house, but right now to turn on and off all your devices you need to go to your mobile phone, each app, and then use them. What if you could control your house with hand gestures, communicating with each of your devices?”
For Jafari, starting with the mission to further facilitate ASL among all participants — and then extending into the home and future applications — the conversation is just getting underway.

Thursday, November 12, 2015

How an iPhone changed a paralyzed veteran's life



While serving in Iraq as a medic in the U.S. Army, Ian Ralston was hit with a tiny ball bearing from an IED in 2010. He was left paralyzed from the neck down. 
Despite the hardship, Ralston quickly adopted a "make the best of it" attitude. "To me, there's absolutely no point in being upset about it," he told the WCF Courier nearly a year after he was injured.
"I mean, yeah, it sucks. And if I had my choice, no, I wouldn't be in a wheelchair. But this is what I got right now. So I might as well make the best of it. If all you want to do is be upset about it, it's just going to make it that much harder for you to live with it, to cope with it. So I got past that quick."
That attitude has helped Ralston thrive. He's married, has newborn twins and uses a special wheelchair that helps him accomplish daily tasks—even operating his smartphone. 

Life changed by a smartphone


Like most new parents, Ralston posts photos of his children to Facebook, where he also posts status updates about his life.
Ralston does this using an iPhone 6 Plus — a phone he got eight months ago. The iPhone is Ralston's first smartphone.
"I got injured before the smartphone craze, so getting one now is really cool," Ralston tells me.
Many of us are familiar with accessibility options for visually impaired individuals; screen readers make it possible for users who are partially or fully blind to still read a computer screen. Dictation software has made it possible for those who can't move their arms or fingers to communicate.
But accessibility tech goes a lot further than that. Ralston, for example, is able to operate his iPhone 6 Plus using his mouth. His wheelchair uses what's called a sip-and-puff system: blowing or sucking into a tube can move his chair around. It can also control the screen on his iPhone.
The iPhone connects to Ralston's chair using a small device called a Tecla Shield. This is basically a Bluetooth adaptor that connects his phone — which is mounted on his wheelchair — to the wheelchair's sip-and-puff controls.
The Tecla Shield is designed to work with iOS devices running iOS 7 or higher. Using a feature called Switch Control, Ralston can control and access different parts of the screen using his mouth.
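Switch Control's internals belong to Apple, but the general switch-scanning pattern it implements is straightforward: the interface steps a highlight through on-screen items, and a single switch event (here, a sip or puff) selects the highlighted one. A toy sketch of the concept, with invented names:

```python
# Illustration of switch scanning, not Apple's implementation.
import itertools

def scan_and_select(items, switch_events):
    """items: selectable UI elements; switch_events: booleans, one per scan
    step, True when the user activates the switch during that step."""
    highlight = itertools.cycle(range(len(items)))
    for idx, pressed in zip(highlight, switch_events):
        print(f"highlighting: {items[idx]}")
        if pressed:
            return items[idx]
    return None

ui = ["Messages", "Facebook", "Camera"]
print(scan_and_select(ui, [False, False, True]))  # selects "Camera"
```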
For Ralston, the impact having a smartphone has had on his life has been both big and small.
Having access to a phone gives him freedom and a sense of independence he didn't have before. "I'm on a vent because I can't breathe on my own," he explains. In the past, that meant he couldn't leave the house alone, in case something happened (a battery got too low, some tubing started to come out). Now he has some independence and can call his wife on her phone if he needs assistance.

"That was something I could never do before."
But it's not just the big stuff. Like most guys his age, Ralston is into fantasy football. He told me how he was recently out and got a notification about news that would affect his lineup. Using Yahoo's official Fantasy Football app, he was able to make changes to his team based on that news. "It sounds really trivial," Ralston concedes — but I disagree.
The reality is that most of us use our phones for a mix of reasons. The fact that Ralston can change his lineup no matter where he is — just like anyone else — is huge.
"I send texts [using the built-in dictation feature] and check Facebook like anyone else," he says. He also uses Siri to look some stuff up — a game score or the weather. 
The iPhone isn't Ralston's first experience with adaptive tech. He has used Dragon NaturallySpeaking software on his PC laptop for a few years to dictate emails or posts. He also uses his mouth to control his computer's mouse.
Still, the iPhone is a new experience because it is attached to the chair and, as a result, can go everywhere with him.
A glitch between Ralston's phone and his chair meant he couldn't use the two together for a few weeks. "I was pissed off!" he tells me, surprised by how quickly he became reliant on the device.
Ralston almost seemed embarrassed by his addiction, but as I pointed out to him, that's the same way all of us, regardless of our physical abilities, feel about our phones. If my iPhone broke tomorrow, I know exactly how long I would be able to last without it: as long as it took to get to the nearest Apple Store to buy a new one.

Letting people know this is out there

Ralston's setup between his wheelchair and his iPhone was all covered by Veterans Affairs.
As far as he knows, he was the 16th person in Washington state to get it. More people, he thinks, should know that this is available.
"And it's not just for vets or those of us with spinal injuries," he says. "People with ALS can benefit, too."
Getting it set up and certified through the VA was seamless, he says — at least in Seattle. "It takes about four hours to install," he says, and then it took him an hour or two to get the hang of it.
Still, the setup was intuitive, and Ralston was able to play around with the system on the way back from getting it installed on his chair.

Getting developers on board

Although Ralston has success with most of the apps he wants to use, it's worth noting that not all apps are built with accessibility in mind.
When I asked him about any apps in particular that don't work well with his setup, he called out mapping apps like Apple Maps and Google Maps. "I like to look at maps and places," he explains, and it can be difficult with the current apps.
Apple makes it relatively easy for developers to add accessibility support — including access to Switch Control — to their apps. If an app supports Apple's screen reader, it probably also supports Switch Control.
Developers don't always focus on accessibility until they hear from someone affected. And Ralston says he often doesn't want to complain because he's just so happy he has access to a phone at all.

Apple and other companies do actively work with vets and with the community to make accessibility better.
Ralston's advice for developers is to start playing with the accessibility settings on their own. There are YouTube videos that show how the features work and by trying to use an app without typical input, developers can get a real-world idea of what it might be like for someone like Ralston to use an app.


Tuesday, November 10, 2015

Meet the runway model with one of the world's most advanced bionic arms



Rebekah Marine, a New Jersey-born congenital amputee, nearly gave up on her dream of becoming a fashion model when a casting director told her she would never make it. But she didn't let that stop her. In recent years, she has strutted down some of the fashion industry's most exclusive runways, all while modeling one of the most advanced bionic arms on the market.

Tuesday, August 25, 2015

Can technology make a hearing-centric world more accessible?




Most humans live in a hearing-centric world, which means that the needs and preferences of people who are deaf or people who have some form of hearing loss are often overlooked. But some people are trying to change that — and they're using technology to do it.
On this week's episode of Top Shelf, you'll see how Gallaudet University researchers are using motion capture technology and interactive apps to ensure that children who are deaf are exposed to language at an early age. Then, you'll meet the owner of Digital Media Services — a company that does closed captioning for Hulu, Netflix, and even Nicki Minaj music videos. Finally, you'll get a glimpse at the changes taking place in the world of hearing aids.


    A world that's truly accessible is one that's adaptable — and inclusive. We aren't there yet, but it's still pretty amazing to see what can happen when technology and human creativity come together.

    Friday, August 7, 2015

    New smartwatch displays texts in braille



    A new smartwatch could allow vision-impaired smartphone users to check their messages without having to play them out loud.
    The Dot is a tiny wearable whose face is perforated with holes; magnet-driven pins rise through them to form the content of messages in braille. When a user receives a text message, the corresponding dots pop up and he or she can run a finger along the watch to read the text.
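    The Dot's firmware isn't public, but the underlying mapping, from characters to six-dot braille cells to raised pins, is standard braille. A small illustrative sketch, with only a few letters included:

```python
# Each character maps to a six-dot braille cell; the watch raises the pins
# for the set dots. Dot numbering (1-3 left column, 4-6 right column)
# follows standard six-dot braille. Illustrative only.

BRAILLE = {  # letter -> set of raised dot positions
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "h": {1, 2, 5}, "i": {2, 4},
}

def cell_pins(ch: str) -> list[bool]:
    """Return the raise/lower state for the six pins of one cell."""
    dots = BRAILLE.get(ch.lower(), set())
    return [d in dots for d in range(1, 7)]

for ch in "hi":
    print(ch, cell_pins(ch))
# h [True, True, False, False, True, False]
# i [False, True, False, True, False, False]
```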
    The device comes from a Korean startup of the same name looking to make new technology more accessible.

    Oftentimes, visually impaired people rely on a smartphone's speaker to read messages out loud, but that method is not very private and can disturb others in public. The smartwatch, however, allows visually impaired users to read their texts discreetly.
    “Until now, if you got a message on iOS from your girlfriend, for example, you had to listen to Siri read it to you in that voice, which is impersonal,” Dot CEO Eric Ju Yoon Kim told Tech in Asia. “Wouldn’t you rather read it yourself and hear your girlfriend’s voice saying it in your head?”
    Aside from text messages, the wearable also relays directions and can serve as an alarm. Of course, it tells the time too. There are other helpful features, such as a haptic engine that buzzes on the user's wrist when a new message is received.
    There's another big improvement with this device over existing tech: cost. A portable computer that can relay messages in braille can cost up to $3,000, while the Dot will retail for $300 when it launches in December. It will sync up with both iOS and Android devices via Bluetooth.

    Thursday, July 16, 2015

    Hands-free wheelchair prototype 'Ogo' built in Kapiti shed out of dismantled Segway


    Otaki inventor Kevin Halsall shows off the Ogo, a hands-free electric wheelchair made from a rebuilt Segway.
    A Segway rebuilt into a hands-free wheelchair with a top speed of 20kmh is on the verge of mass production.
    The Ogo, built in an Otaki shed by Kevin Halsall, has reached its third prototype and, with a few more tweaks and the help of investor support, will be ready for sale.
    There is only one Ogo in existence: a hand-built fibreglass prototype.




    Halsall built the first version of the Ogo four years ago with his mate Marcus Thompson, who is a paraplegic, in mind.
    It started when Halsall first hopped on a Segway.
    "The first thing I thought was 'if I didn't have my legs this would be the perfect thing I'd be adapting'."
    For the first version, he borrowed a Segway and added a bolt-on seat, but "it needed to go a stage further", he said.
    "The steering and the sensing of it needed to be refined more, and the only way I could do that was getting into the guts of the Segway."
    That meant he had to buy his own Segway, costing about $14,000, which he could strip apart.
    Halsall's work left only the "bare bones" of the Segway, with a new patented moving seat control installed.
    Mastering a Segway can be a challenge: the devices are operated by leaning on the handlebars to steer, and transferring weight backwards and forwards to accelerate and brake.

    The moving seat made the acceleration and braking more responsive to movements from the rider's core muscles, he said.
    His mate Thompson used the device to mow his lawns, and trialled it at his work as a teacher at Otaki College.
    Halsall said doing things like picking up items, moving around while holding them, and mowing lawns sounded mundane to most people.
    "But when you're in a wheelchair you just can't do it."
    Thompson got a buzz out of mowing his lawns, Halsall said, because as a paraplegic he previously relied on others to do work like that.
    The Ogo comes with additional wide wheels that allow it to become "an off-road monster", he said.

    Halsall said in ideal conditions the Segway powering the Ogo had a range of about 40 kilometres, but with everyday use would travel about 30km.
    "Marcus, he's had it going from his home to the school, all day at the school, in the classroom environment, then back home again."
    Halsall said the machine itself was perfectly safe but its high speed and control challenges created a "danger element".
    This, he said, was part of the attraction.
    "Nothing is really exciting unless it's got a bit of an element of danger."
    Halsall, a plastic products designer, built the Ogo in his shed on his property in Otaki.

    The Ogo is a finalist in the Innovate awards covering the region, competing against 10 other inventions in the finals at the end of August.
    It is part of a bid to generate interest in the device from angel investors, alongside a potential run for crowdfunding.
    He said he was still looking for someone with business and commercialisation expertise to launch the product globally.
    His goal is to start manufacturing the Ogo and then create new electronic products for people with disabilities.
    "The scale of the manufacturing side of things can be scaled-up as it grows. The more the better."
     - Stuff

    Tuesday, July 14, 2015

    Nike designed a sneaker for people with disabilities



    Whether in clothing or footwear, Nike is known for never being afraid to experiment with new technologies. The latest example is the company's new Zoom Soldier 8, a gorgeous shoe designed for people with disabilities -- such as amputees and those who have suffered a stroke or live with cerebral palsy. With the sneaker's Flyease tech, which features an unusual zipper mechanism that wraps around the heel, Nike has made it easier for the disabled community to fasten their shoes. Instead of having to use both hands to accomplish this task, something that may not be possible or easy for some, Flyease lets wearers rely on one hand to open or close the shoe.




    As Fast Company reports, Nike began development of the zipping system seven years ago; CEO Mark Parker made a special request to Senior Director of Athlete Innovation Tobie Hatfield after an employee had a stroke and lost movement in his right hand. Despite Flyease's launch, though, Hatfield tells Fast Company that Nike will continue to research and improve the new mechanism. "We know we can continue to make improvements," he said, "but we wanted to give access to those who need this sooner than later."
    But let's hope that when the Soldier 8 Flyease launches, on July 16th, it ends up on the feet of people who could actually benefit from wearing the shoe. Here's Nike's main problem: resellers like to buy their sneakers and then post them on eBay at double, triple and sometimes quadruple the retail price -- which is bad news for everyone.

    Monday, July 6, 2015

    Supermom designs superhero hearing aids for kids

    UK mother Sarah Ivermee is a definite contender for supermom status.
    Ivermee's son Freddie suffers from hearing loss in one ear and deafness in the other. To help him hear throughout the day, he wears a cochlear implant, an electronic device that sends sound signals to the brain.
    Wearing hearing aids can be hard for young children, but that's where Ivermee comes in. When a friend mentioned that her daughter felt self-conscious about wearing her aids to school, Ivermee suggested adding nail stickers for a little flair.
    Now, Ivermee manages Lugs, a line of fun, colorful decoration kits for hearing aids and cochlear implants that kids will feel cool wearing — because let's face it, who wouldn't choose Spider-Man or Minions over plain blue?
    [Images: Lugs hearing aid designs featuring Spider-Man, Minions and more — via MyLugs]

    Thursday, June 25, 2015

    TC Makers: Kinova Robotics Gives The Disabled A Helping Hand

    Montreal-based Kinova Robotics was founded in part because of one man – the founder’s uncle, Jaco, a disabled inventor who created a manipulator for his wheelchair out of a hot dog pincher, some microswitches, and some complex electronics. This early attempt, while primitive, was the basis for a whole series of amazing – and amazingly useful – robotic arms.
    Kinova’s CEO Charles Deguire took TC Makers around his small factory outside of the city, where he and his team are building robotic arms as fast as they can. The arms fit onto standard wheelchairs and allow folks to feed themselves, manipulate objects, and live an independent life. The team sells arms all around the world, and over 150 users in the Netherlands are currently feeding themselves using Jaco.
    The arms are lightweight and low-cost, and the team builds the entire modular system in its factory. It can produce about five robots per day and is working on a new model to make things lighter still. Researchers are also using the system for research into machine learning and robotics.
    Kinova is an amazing company doing something admirable: giving a helping hand to people with disabilities.

    Friday, June 12, 2015

    This Incredible Wheelchair Can Climb Stairs Like a Tank


    Technology has been making wheelchairs more convenient and easier to use, but this crazy amazing model that actually scales staircases is a metaphorical mic drop.
    This prototype, called the Scalevo, has rubber, tank-like treads mounted to the bottom of the chair. The user approaches a set of steps backwards, back facing the stairs. The treads extend, lifting the chair at an angle and allowing it to crawl up the steps while keeping the user level at all times. The headlight- and taillight-equipped chair has two extra sets of wheels that pop out at the last step to provide a smoother transition back to flat ground.
    It began as a student project last summer, and now ten electrical engineering and industrial design students from the Swiss Federal Institute of Technology and the Zurich University of the Arts are working on it. They say the Segway was an inspiration. I guess there’s a first time for everything.
    The team wants it ready by next year’s Cybathlon, a race for people with physical disabilities that use assisting robotic tech, like exoskeletons and electrically stimulated muscles.

    Friday, June 5, 2015

    Hands-On With Doppler Labs’ Here Active Listening System




    In short, these are two independent ear buds that (with zero latency) give you complete audio control over your environment. If you want to reduce the volume of a crying baby or a screeching subway train (without turning everything down), the Here buds will let you do that. If you’re at a concert and you want to pump up the bass, the Here buds will let you do that, too.
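    Doppler Labs hasn't published its signal processing, and a truly zero-latency system wouldn't use a block FFT like the one below; still, as a rough picture of per-band volume control, here is a hypothetical sketch that scales the energy in chosen frequency bands of an audio block:

```python
# Illustrative per-band gain control; not Doppler Labs' actual DSP.
import numpy as np

def apply_band_gains(block: np.ndarray, rate: int, gains: dict) -> np.ndarray:
    """gains: {(low_hz, high_hz): linear_gain}, e.g. cut a 2-5 kHz screech."""
    spectrum = np.fft.rfft(block)
    freqs = np.fft.rfftfreq(len(block), d=1.0 / rate)
    for (lo, hi), g in gains.items():
        spectrum[(freqs >= lo) & (freqs < hi)] *= g
    return np.fft.irfft(spectrum, n=len(block))

# Example: tame a screeching 2-5 kHz band, gently boost bass below 150 Hz.
rate = 48_000
t = np.arange(rate) / rate
audio = np.sin(2 * np.pi * 100 * t) + np.sin(2 * np.pi * 3000 * t)
out = apply_band_gains(audio, rate, {(2000, 5000): 0.2, (0, 150): 1.5})
```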
    The product launched on Kickstarter two days ago with a $250K goal and surpassed it in around 48 hours. In the meantime, we sat down with Doppler Labs founder Noah Kraft to discuss the product and get exclusive hands-on time with the app.
    I realize that this video doesn’t actually give you the opportunity to hear what the Here buds sound like, but you can watch me do it!

    Friday, April 24, 2015

    Disney Invented a Way To Control Your Phone Using the Sounds It Emits


    In an effort to bring more functionality and interactivity to a device that is often just a large monolithic touchscreen, researchers at Carnegie Mellon University and Disney Research have come up with a series of accessories that manipulate sound coming from a smartphone’s speaker to serve as an external controller.
    Called Acoustruments, the cheap plastic accessories direct an ultrasonic sound from a smartphone’s speaker back to its own microphone. But between the speaker and mic sit obstacles that vary the pitch and intensity of the sound—similar to how moving the slide on a slide whistle changes the sound it produces.

    An accompanying app knows exactly what sound is being produced, and by comparing that to what the microphone ends up hearing, the Acoustrument can determine how a user is interacting with the device. The accessory could be as simple as a snooze button, or a wheel that changes settings like a dial, or a full-on smartphone case that knows when the device is sitting on a table, being held, or even squeezed like a camera.
    And because it doesn’t actually draw any power from the device (besides the constant ultrasonic tone being produced), this added functionality doesn’t hinder a smartphone’s already limited battery life. So eventually it could be implemented directly into a device’s housing, adding more buttons or ways to interact with a smartphone, without any negative consequences. [Disney Research]
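    The researchers' implementation details aren't in this article, but the sensing loop it describes (play a known ultrasonic tone, measure how the accessory attenuates it at the microphone, map attenuation to a control state) can be sketched as follows; the frequency and thresholds are invented:

```python
# Illustrative sketch of ultrasonic state sensing; thresholds are made up.
import numpy as np

TONE_HZ = 20_000  # ultrasonic: above most human hearing, within phone range

def tone_level(mic_block: np.ndarray, rate: int) -> float:
    """Relative amplitude of the emitted tone in one block of mic samples."""
    spectrum = np.abs(np.fft.rfft(mic_block))
    freqs = np.fft.rfftfreq(len(mic_block), d=1.0 / rate)
    return float(spectrum[np.argmin(np.abs(freqs - TONE_HZ))])

def classify(level: float, open_level: float) -> str:
    """Map attenuation of the tone to a control state, like a button press."""
    ratio = level / open_level
    return "pressed" if ratio < 0.3 else "half" if ratio < 0.7 else "open"
```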

    Wednesday, April 15, 2015

    How electrifying the brain wards off Parkinson's disease


    Implanting electrodes in the brain and zapping it helps patients with Parkinson's and other disorders, but doctors have never been sure exactly why. Now, researchers from UC San Francisco think that the therapy (called deep-brain stimulation, or DBS) works by altering neural timings, in much the same way a defibrillator resets heart rhythms. In a healthy brain, neuron firing is controlled by low-frequency rhythms that sync up movement, memory and other functions. But the UC team found that this synchronization is too strong in Parkinson's patients, making it harder for them to move voluntarily.
    The brain needs a balance between autonomy and rhythm, which is where DBS comes in. The implants appear to lower the overly lock-step synchronization, improving patient coordination and other symptoms. But the invasive, six-hour-long surgery requires implanting probes into deep brain structures and the patient must be awakened mid-surgery to test it out -- so anything that can make it more effective is helpful. According to the researchers, "we can (now) begin to think of new ways for stimulators to be automatically controlled by brain activity, which is the next innovation in treatment for movement disorders."
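    The researchers' analysis pipeline isn't described here, but a standard way to quantify this kind of phase synchronization between two recorded signals is the phase-locking value (PLV). This generic sketch is illustrative only, not the UCSF team's method:

```python
# Generic phase-locking value (PLV) computation; illustrative only.
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x: np.ndarray, y: np.ndarray) -> float:
    """PLV in [0, 1]: 1 means the two signals' phases are perfectly locked."""
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

# Two noisy signals sharing a common low-frequency rhythm show a high PLV.
t = np.linspace(0, 2, 2000)
rhythm = np.sin(2 * np.pi * 20 * t)
x = rhythm + 0.3 * np.random.randn(t.size)
y = rhythm + 0.3 * np.random.randn(t.size)
print(round(phase_locking_value(x, y), 2))  # close to 1.0
```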