Tuesday, April 12, 2016
Expresso app will show you everything wrong with your writing
If you want to see how well you’re writing, the free text-analyzer Expresso will give you a detailed breakdown of whatever you give it.
The web-based Expresso, spotted on Product Hunt, allows you to type or paste in text to see different metrics of your writing. Hit the "Analyze text" button to see things like what percentage of sentences are extra-long or short, which words are filler and which verbs are weak. After clicking different metrics, Expresso will highlight the corresponding text with different colors.
You can highlight different parts of text in Expresso. Here, weak verbs are blue and filler words are green.
IMAGE: EXPRESSO
Expresso, currently in beta, lets you closely examine how you write and how often you use certain types of words. By analyzing my own writing, I can see that I often use clustered nouns. What are clustered nouns? Expresso will tell you they are "three or more consecutive nouns." Hovering your cursor over any metric that you don't know will give you a concise explanation, so don't worry if you can't pick out a nominalization or modal on your own.
Expresso explains any writing metrics that might be confusing to non-English majors.
IMAGE: EXPRESSO
While Expresso gives you all these helpful extensive metrics, that's all it really does, so don't go uninstalling Microsoft Word. There are no options to format the text in any way or save it, so it's more of a handy tool that fits alongside a good text editor.
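Many of Expresso's simpler checks can be approximated in a few lines of code. Here is a minimal sketch of that kind of analysis; the filler-word list and the sentence-length thresholds are illustrative choices of my own, not Expresso's actual rules:

```python
import re

# An illustrative filler-word list, not Expresso's real one
FILLER_WORDS = {"really", "very", "just", "quite", "basically", "actually"}

def analyze(text, long_sentence=30, short_sentence=5):
    """Report simple Expresso-style metrics for a block of text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text.lower())
    lengths = [len(re.findall(r"[A-Za-z']+", s)) for s in sentences]
    return {
        "sentences": len(sentences),
        "extra_long": sum(1 for n in lengths if n >= long_sentence),
        "extra_short": sum(1 for n in lengths if n <= short_sentence),
        "filler_pct": 100 * sum(w in FILLER_WORDS for w in words) / max(len(words), 1),
    }

report = analyze("This is really just a very short test. It basically works.")
```

The real tool goes further, tagging parts of speech to find weak verbs, nominalizations and clustered nouns, but the pattern is the same: count occurrences, divide by totals, highlight the offenders.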
Expresso has an obvious appeal for writers, but it could also be helpful for editors or teachers, who can quickly paste in text to catch weak spots that might otherwise be overlooked, or to get a sense of different writers' patterns.
Wednesday, April 6, 2016
Facebook’s iPhone app is helping blind people see
On April 4, Facebook rolled out automatic alternative (alt) text for visually impaired and blind users on its iOS app. The feature uses artificial intelligence to scan the picture, recognize objects and create a text description. The feature is available immediately in the Facebook for iPhone app as long as users have Apple’s built-in screen reader VoiceOver enabled.
Currently the feature is only available for English-speaking iPhone users in the U.S., U.K., Canada, Australia, and New Zealand, but the company’s recent blog post explaining the technology states it hopes to roll it out to other platforms, languages and countries soon. The blog post also says serving the world's 39 million blind and 246 million visually impaired people furthers Facebook’s mission to “make the world more open and connected.”
Facebook built the object recognition technology behind the alt text on “a neural network that has billions of parameters,” trained with millions of examples from Facebook’s large repository of images.
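Once the network has produced a set of detected concepts with confidence scores, assembling the spoken description is straightforward. A toy sketch of that last step (the function name, threshold and fallback are my own assumptions, not Facebook's actual pipeline):

```python
def alt_text(detections, threshold=0.8):
    """Build an alt-text string from (label, confidence) pairs,
    keeping only confident detections. Illustrative only, this is
    not Facebook's real post-processing code."""
    labels = [label for label, conf in detections if conf >= threshold]
    if not labels:
        return "Photo"  # fall back to the generic description
    return "Image may contain: " + ", ".join(labels)

alt_text([("two people", 0.95), ("smiling", 0.9), ("car", 0.4)])
# returns "Image may contain: two people, smiling"
```

Thresholding matters here: reading out a wrong guess to a blind user is worse than staying silent, so low-confidence detections are dropped.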
The feature promises the ability to recognize objects, people, basic emotions and background locations. In our test of the new feature, only one out of three pictures had alt text enabled, but the one that did was accurate. While the tech is still clearly a work in progress, alt text makes scrolling through Facebook far more engaging when it works. Facebook is an extremely visual platform, and without alt text, each photo is merely described by VoiceOver as a “photo.” The company noted that this can lead to blind and visually impaired users feeling left out from the fun of the social app, linking to a recent joint study conducted with Cornell University.
Microsoft featured a similar technology called Seeing AI at its recent Build conference in San Francisco, California. Seeing AI provides the same simple recognition as the Facebook feature, able to recognize objects, people and basic emotions. Google is working on image content analysis technology as well, with Cloud Vision API.
Monday, February 22, 2016
Google Cardboard lets woman see for the first time in 8 years
It seems Google’s low-cost virtual reality headset is useful for more than just simple VR tech demos.
A woman named Bonny (her last name is not listed in the YouTube video) suffers from Stargardt disease, a common eyesight problem that causes photoreceptor cells in the retina to deteriorate, and in rare cases leads to complete vision loss. Thanks to Google Cardboard, as well as a free app called Near Sighted VR Augmented Aid, she is now able to see for the first time in eight years.
Near Sighted VR takes the video feed from a smartphone’s rear facing camera and delivers that same video to each eye, creating a stereoscopic image. This is similar to how the Gear VR’s live view mode works, giving the user a digital view of the world around them while still wearing a virtual reality headset.
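Conceptually the rendering step is simple: each frame from the camera is drawn twice, once into the left half of the display and once into the right. A toy sketch, treating a frame as a list of pixel rows (real apps also apply lens-distortion correction, which this omits):

```python
def side_by_side(frame):
    """Duplicate a camera frame into the left and right halves of a
    stereoscopic display buffer. frame is a list of pixel rows."""
    return [row + row for row in frame]

frame = [[1, 2], [3, 4]]       # a tiny 2x2 "image"
stereo = side_by_side(frame)   # 2x4: the same image shown to each eye
```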
Apparently, viewing a stereoscopic image so close to her face allowed Bonny to regain some eyesight. A recent video uploaded to YouTube showed Bonny seeing her family for the first time in nearly a decade.
In the past, Cardboard has been used to help doctors successfully perform a high-risk surgery.
Tuesday, February 9, 2016
'Bionic spinal cord' aims to move robotic limbs with power of thought
Australian scientists hope a device about the size of a matchstick will one day help people with spinal cord injuries get back on their feet.
The device, a stent-electrode recording array or stentrode, could allow patients to control powered body armour, known as exoskeletons, or bionic limbs using only their thoughts, researchers announced Tuesday at the University of Melbourne.
The stentrode will be implanted in a blood vessel that sits over the brain, and will turn brain signals into electrical commands that could wirelessly move the exterior mechanical technology. Currently, most exoskeletons are controlled by a joystick that is operated manually.
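The decoding idea, turning a stream of recorded neural activity into discrete commands, can be illustrated with a deliberately simple sketch: measure the power of the signal over short windows and emit a command when it crosses a threshold. Real decoders are trained classifiers operating on many electrode channels; everything below (window size, threshold, the two-command vocabulary) is my own illustration:

```python
import math

def decode(signal, window=4, threshold=0.5):
    """Toy brain-signal decoder: emit 'move' when the RMS power of a
    window of samples exceeds a threshold, otherwise 'rest'."""
    commands = []
    for i in range(0, len(signal) - window + 1, window):
        chunk = signal[i:i + window]
        rms = math.sqrt(sum(x * x for x in chunk) / window)
        commands.append("move" if rms > threshold else "rest")
    return commands

cmds = decode([0.1, -0.1, 0.1, 0.0, 0.9, -0.8, 1.0, -0.9])
# quiet window then active window -> ["rest", "move"]
```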
A collaboration between the Royal Melbourne Hospital, the Florey Institute of Neuroscience and Mental Health and the University of Melbourne, the findings were published Tuesday in the journal, Nature Biotechnology.
Speaking at a press conference Tuesday, Thomas Oxley, a neurologist at the Royal Melbourne Hospital, said the project began when he had the idea that thought control of bionic limbs could be achieved without implanting a device through open brain surgery.
Preclinical studies have shown that by putting the stentrode in a blood vessel near the motor cortex, a key control centre for the brain, researchers can obtain the same recordings previously possible only by surgically implanting electrodes directly into the brain. Oxley proposed that, using the stentrode, patients will one day be able to control mechanical limbs with their thoughts.
"The idea is the device is much less invasive than previous attempts at doing this and can be implanted long term," said Terry O'Brien, head of medicine at the Departments of Medicine and Neurology, the Royal Melbourne Hospital and University of Melbourne. "There is no clinical device that does this at the moment."
It could also have applications far beyond assisting those with paralysis, he added. The stentrode could be used to record brain waves for those with conditions such as epilepsy, helping to predict when they are about to have an attack. "The applications are incredibly broad, and that's what makes it so exciting," O'Brien said.
The device, which has so far been tested on sheep, will undergo its first human trials in 2017. According to a statement from the University of Melbourne, patients will have to, in many ways, learn to walk and stand again by getting familiar with "coding" the signals to their exoskeleton. "With our device, you've essentially connected an electronic limb to the patient's brain, but they have to learn how to use it," Oxley said, according to the statement.
Nick Opie, a biomedical engineer at the University of Melbourne, told reporters the team hoped the cost of the device, after it has undergone human testing and is ready for market, would be similar to the cochlear implant — around A$15,000 (US$10,567) to A$20,000 (US$14,089). They predict it will be ready by 2022.
It is also hoped the stentrode will be as important to medicine as the cochlear implant, which was invented in Australia. "What cochlear implants have done for hearing, we want to do for mobility," Opie said.
Interest in bionic limb and exoskeleton technology has been developing rapidly in recent years. Earlier in February, SuitX announced a new Phoenix exoskeleton that aims to replicate human gait.
Friday, December 18, 2015
Robotic 'glove' will allow the blind to see with their hands
A team of researchers from the University of Nevada, Reno and the University of Arkansas, Little Rock announced on Thursday that they are developing a robotic wearable that will enable the visually impaired to "see" what they're reaching for.
"We will pre-map the hand, and build a lightweight form-fitting device that attaches to the hand using key locations for cameras and mechanical and electrical sensors," Yantao Shen, assistant professor and lead researcher on the project from the University of Nevada, Reno's College of Engineering, said in a statement. "It will be simpler than a glove, and less obtrusive." The device will combine tactile and temperature sensors, high resolution cameras and miniaturized microphones to sense items around it. This information -- the item's location, size and shape -- is then relayed to the user via haptic and audio feedback.
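One natural way to relay "how close is the object" through haptics is to map sensed distance to vibration intensity, so that nearer objects buzz harder. A toy sketch of that mapping; the range, the number of intensity levels and the function itself are my own assumptions, not the team's published design:

```python
def haptic_level(distance_cm, max_range_cm=50, levels=10):
    """Map a sensed distance to a vibration intensity from 0 (off)
    to `levels` (strongest): closer objects vibrate harder."""
    if distance_cm >= max_range_cm:
        return 0  # out of range: no feedback
    closeness = 1 - distance_cm / max_range_cm
    return max(1, round(closeness * levels))

haptic_level(5)   # near object -> strong vibration (level 9)
haptic_level(60)  # out of range -> 0, motor off
```

The `max(1, ...)` clamp makes sure any object inside the sensing range produces at least a faint buzz rather than silently rounding down to zero.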
"Not only will this device help blind and visually impaired people, the methods and technology we develop will have great potential in advancing small and wearable robot autonomy," Shen added. The technology is still in development and there is no word on when it will actually be made available to the public.
Source: University of Nevada, Reno