TECH SPACE
Computer scientists simplify deep learning
by Brooks Hays
Washington (UPI) Jun 1, 2017


Computer scientists at Rice University have developed a new technique for reducing the amount of computation required for deep learning.

The simplification technique is akin to the indexing methods commonly used to cut down the computation needed for large-scale data analysis.

"This applies to any deep-learning architecture, and the technique scales sublinearly, which means that the larger the deep neural network to which this is applied, the more the savings in computations there will be," lead researcher Anshumali Shrivastava, an assistant professor of computer science, said in a news release.

Deep-learning networks hold tremendous potential in a variety of fields, from healthcare to communications. But the networks remain cumbersome, requiring significant amounts of computing power.

Scientists regularly use a data-indexing method called "hashing" to slim down large amounts of computation. Hashing maps large pieces of data to compact numeric codes and organizes those codes into indexes, so a lookup can skip most of the data.
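As a minimal illustrative sketch (not the researchers' code), hashing assigns each item a compact code that decides which index bucket it lands in, so a query inspects one bucket instead of scanning everything. The bucket count and sample words here are arbitrary.

```python
import hashlib

def bucket_of(item: str, num_buckets: int = 16) -> int:
    """Map an item to one of num_buckets index buckets via a hash digest."""
    digest = hashlib.sha256(item.encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_buckets

# Build the index once: bucket id -> items that hashed there.
index = {}
for word in ["dog", "cat", "dogs", "catalog"]:
    index.setdefault(bucket_of(word), []).append(word)

# A lookup now touches only a single bucket rather than every stored item.
candidates = index.get(bucket_of("dog"), [])
```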

"Our approach blends two techniques -- a clever variant of locality-sensitive hashing and sparse backpropagation -- to reduce computational requirements without significant loss of accuracy," said Rice graduate student Ryan Spring. "For example, in small-scale tests we found we could reduce computation by as much as 95 percent and still be within 1 percent of the accuracy obtained with standard approaches."

Deep learning relies on artificial neurons, mathematical functions that turn one or more received inputs into an output. Neurons in deep learning networks begin as blank slates but become more specialized over time as they learn from the inflow of data. As neurons become more specialized, they form a hierarchy of functions.
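A single artificial neuron of the kind described above can be written as a short function: a weighted sum of its inputs plus a bias, passed through a nonlinearity. The sigmoid choice and the numbers below are illustrative, not taken from the Rice work.

```python
import math

def neuron(inputs, weights, bias):
    """Weighted sum of inputs plus bias, squashed through a sigmoid."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Two inputs in, one output value between 0 and 1 out.
y = neuron([0.5, -1.0], [2.0, 0.3], 0.1)
```

Training adjusts the weights and bias, which is how an initially blank-slate neuron becomes specialized to particular patterns in the data.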

Low-level neurons perform simple functions, while higher-level neurons are responsible for more sophisticated computations. More layers can yield more sophisticated and powerful results. But more layers require more time, space and energy.

"With 'big data,' there are fundamental limits on resources like compute cycles, energy and memory. Our lab focuses on addressing those limitations," Shrivastava said.

Researchers believe their latest efforts will allow computer scientists to more efficiently deploy large deep learning networks.

"The savings increase with scale because we are exploiting the inherent sparsity in big data," Spring said. "For instance, let's say a deep net has a billion neurons. For any given input -- like a picture of a dog -- only a few of those will become excited. In data parlance, we refer to that as sparsity, and because of sparsity our method will save more as the network grows in size. So while we've shown a 95 percent savings with 1,000 neurons, the mathematics suggests we can save more than 99 percent with a billion neurons."
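Spring's scaling claim is back-of-the-envelope arithmetic: if roughly the same small number of neurons fire regardless of network size, the fraction of work that can be skipped grows with the network. The assumption of 50 active neurons below is hypothetical, chosen only to match the quoted 95 percent figure at 1,000 neurons.

```python
def savings(total_neurons, active_neurons):
    """Fraction of neuron evaluations skipped if only active_neurons fire."""
    return 1.0 - active_neurons / total_neurons

print(f"{savings(1_000, 50):.0%}")          # 95% skipped at small scale
print(f"{savings(1_000_000_000, 50):.5%}")  # well above 99% at a billion
```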

Shrivastava and Spring are scheduled to share their work at the SIGKDD Conference on Knowledge Discovery and Data Mining, to be held in August in Nova Scotia.
