10 cool network and computing research projects
- 22 January, 2015 01:00
If you think the latest enterprise and consumer network and computing technologies rolling into your data center, and being snuck into your offices by end users, are advanced, wait until you see what's cooking in the labs at universities and tech companies. Much of the well-funded research is aimed at security, at simplifying the use of current technology, and at figuring out how to more easily plow through mounds of big data. Here's a peek at 10 projects.
Idiot-proof Smartphone Charging
Microsoft researchers are working on technology to make smartphone charging much less of a burden for users. The techniques being explored include an image-processing technique for detecting and locating smartphones in an office, as well as solar/photovoltaic cell technology that works indoors so the system, dubbed AutoCharge, can charge phones via a beam of light.
The prototype showed evidence of being able to charge phones as fast as wired solutions, according to the researchers, and of being easier to use than current wireless charging techniques that require users to put their phones on a charging pad.
Lazy Supercomputing
You'd think there's nothing lazy these days about supercomputing, given how fast these beasts process data, but computer scientists at the Department of Energy's Oak Ridge Leadership Computing Facility are taking what they call a lazy approach to making the machines run more efficiently.
More specifically, the researchers are seeking a better way of "checkpointing" applications, that is, storing information about an app's state. They want to avoid checkpointing too often on high-performance computers, but want to do it often enough that, if there is an app or system failure, minimal work will be lost. Their notion, based on their research, is that errors tend to cluster around an original hardware failure, so checkpointing frequency should be increased at that point and then eased off once things settle down. Such lazy checkpointing could reduce I/O volume by 20% to 30%, and that would give supercomputers a real performance boost.
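The ramp-up-then-ease-off schedule described above can be sketched in a few lines. This is a toy illustration only; the interval constants and the linear ramp back to the baseline are assumptions, not details from the Oak Ridge work:

```python
# Toy sketch of "lazy" checkpointing: checkpoint aggressively right after a
# failure (when follow-on errors tend to cluster) and ease back off as the
# system stays healthy. All constants here are illustrative assumptions.

BASE_INTERVAL = 60.0   # minutes between checkpoints when things are calm
MIN_INTERVAL = 10.0    # aggressive rate right after a failure
RECOVERY_STEPS = 5     # failure-free checkpoints needed to relax fully

def next_interval(steps_since_failure: int) -> float:
    """Interpolate from the aggressive rate back to the lazy baseline."""
    if steps_since_failure >= RECOVERY_STEPS:
        return BASE_INTERVAL
    frac = steps_since_failure / RECOVERY_STEPS
    return MIN_INTERVAL + frac * (BASE_INTERVAL - MIN_INTERVAL)

# Right after a failure we checkpoint every 10 minutes...
print(next_interval(0))   # 10.0
# ...and five clean checkpoints later we are back to the lazy 60-minute rate.
print(next_interval(5))   # 60.0
```

Fewer checkpoints at steady state means less I/O, which is where the claimed 20% to 30% reduction in I/O volume would come from.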
Getting a Fast Start
Cornell University researchers, along with colleagues from the University of Connecticut and other institutions, recently published a paper in the journal Nature that describes a theoretical and experimental discovery involving the use of a multiferroic material (bismuth ferrite) to build a memory device that conjures up visions of lightweight computers that start up even faster than some of today's quick-start offerings.
Their advance would potentially allow for smaller, more reliable and less energy-intensive devices by getting around the need to use electric currents to encode data even at room temperature. While their breakthrough is promising, they made just a single device, and it would take billions of them to build a usable computer memory system, according to Cornell.
Exploring the Cloud
The National Science Foundation is devoting $10 million to a multi-school effort to build better cloud computing infrastructure, which is so important for researchers in fields ranging from physics to medicine to genetics.
The University of Wisconsin-Madison, University of Utah and Clemson University will each operate interconnected large-scale data centers for CloudLab, which will enable researchers in networking, storage and security to examine ways to bolster the cloud. Vendors such as Cisco will align with the schools on the project. The University of Massachusetts in Amherst, Raytheon BBN Technologies and US Ignite are also key players in the CloudLab effort.
University of Wisconsin computer science professor Aditya Akella said in a statement that "Almost all major services we depend on today rely on cloud computing. Our digital and physical lives are increasingly shaped by modern-day clouds."
Another $10 million NSF-funded experimental cloud project, dubbed Chameleon, is anchored by the University of Chicago and the University of Texas at Austin, which will oversee a giant reconfigurable cloud infrastructure boasting 650 nodes and 5 terabytes of storage. This bare-metal cloud infrastructure is designed to enable researchers to work with new virtualization technologies.
Autocomplete for Code
Rice University researchers are leading a four-year project, backed by $11 million from DARPA, to create a tool called PLINY designed to autocomplete and autocorrect code for programmers.
"Imagine the power of having all the code that has ever been written in the past available to programmers at their fingertips as they write new code or fix old code," said Vivek Sarkar, Rice's E.D. Butcher Chair in Engineering, chair of the Department of Computer Science and principal investigator on the PLINY project, in a statement. "You can think of this as autocomplete for code, but in a far more sophisticated way."
Researchers from the University of Texas at Austin, the University of Wisconsin-Madison and the company GrammaTech are also working on PLINY, which will be centered around a data mining engine designed to plow through oodles of open source computer code.
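To get a toy flavor of what mining existing code for completions might look like (PLINY's engine is far more sophisticated, and nothing here comes from the project itself), consider a naive model that simply counts which token most often follows a given pair of tokens in a small corpus:

```python
# Naive "autocomplete from mined code": count, over a corpus of tokenized
# code lines, which token most often follows each pair of tokens, then
# suggest the most common continuation. Purely illustrative.
from collections import Counter, defaultdict

def build_model(corpus_lines):
    model = defaultdict(Counter)
    for line in corpus_lines:
        toks = line.split()
        for a, b, c in zip(toks, toks[1:], toks[2:]):
            model[(a, b)][c] += 1
    return model

def suggest(model, a, b):
    """Return the most common continuation seen after tokens (a, b)."""
    followers = model.get((a, b))
    return followers.most_common(1)[0][0] if followers else None

corpus = [
    "for i in range ( n ) :",
    "for x in items :",
    "for i in range ( 10 ) :",
]
model = build_model(corpus)
print(suggest(model, "for", "i"))     # "in"
print(suggest(model, "in", "range"))  # "("
```

A real system would mine billions of lines and reason about program structure rather than raw token sequences, but the core idea of learning completions from a corpus of existing code is the same.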
Targeting Second-Order Vulnerabilities
A pair of researchers from Ruhr-Universität Bochum in Germany last year were awarded the first $50K Internet Defense Prize for their work in combating "second-order vulnerabilities" in Web apps: threats that lurk on Web servers until the time is right to strike.
The researchers, Johannes Dahse and Thorsten Holz, describe their work in greater detail in a paper titled "Static Detection of Second-Order Vulnerabilities in Web Applications." In it, they describe the use of automatic static code analysis to detect vulnerabilities before they inflict their pain on victims. (Second-order vulnerabilities are distinct from first-order threats like SQL injection and cross-site scripting.)
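A minimal sketch of what makes these flaws "second-order," using a hypothetical two-step app (none of this code is from the paper): the malicious input is stored harmlessly at first, and only becomes an injection when a later code path trusts the stored value:

```python
# Illustrative second-order SQL injection: the write path is safe, but a
# later read path splices the stored value into a query by concatenation.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT)")

# Step 1: registration stores the attacker's name SAFELY via a bound
# parameter, so a first-order scan of this code path finds nothing wrong.
evil_name = "bob' OR '1'='1"
db.execute("INSERT INTO users (name) VALUES (?)", (evil_name,))

# Step 2: a later feature re-reads the stored name and builds a query by
# string concatenation -- this is the second-order injection point.
stored = db.execute("SELECT name FROM users").fetchone()[0]
unsafe_query = "SELECT * FROM users WHERE name = '" + stored + "'"
print(unsafe_query)  # the stored quote now breaks out of the string literal
```

Detecting this statically is hard precisely because the dangerous data flow passes through persistent storage, which is the gap Dahse and Holz's analysis targets.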
Cutting the Queue
A group of MIT researchers say they've invented a technology that should all but eliminate queuing in data center networks. The technology, dubbed Fastpass, uses a centralized arbiter to analyze network traffic holistically and make routing decisions based on that analysis, in contrast to the more decentralized protocols common today. Experiments in Facebook data centers show that a Fastpass arbiter with just eight cores can manage a network transmitting 2.2 terabits of data per second, according to the researchers.
Professor Hari Balakrishnan, a co-author of the paper, admitted that this isn't an intuitive solution to the problem of network lag. "It's not obvious that this is a good idea," he said in a statement.
The trick, the researchers said, is a new way of dividing up the processing power needed to calculate transmission timings among multiple cores. In essence, Fastpass organizes workloads by time slot, rather than by source and destination pair. A core gets its own time slot, and schedules requests to the first free servers it can find, passing everything else on to the next core, which follows suit.
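The slot-per-core scheme can be sketched roughly as follows. This is an illustrative simplification, not the MIT implementation, and the greedy first-free matching rule is an assumption:

```python
# Simplified Fastpass-style arbitration: each "core" owns one timeslot and
# greedily admits requests whose source and destination are still free in
# that slot, passing everything unmatched on to the next core's slot.

def arbitrate(requests, num_slots):
    """requests: list of (src, dst) pairs. Returns {request index: slot}."""
    schedule = {}                      # request index -> assigned timeslot
    pending = list(enumerate(requests))
    for slot in range(num_slots):      # one slot per arbiter core
        busy_src, busy_dst = set(), set()
        leftover = []
        for idx, (src, dst) in pending:
            if src not in busy_src and dst not in busy_dst:
                schedule[idx] = slot   # both endpoints free: send this slot
                busy_src.add(src)
                busy_dst.add(dst)
            else:
                leftover.append((idx, (src, dst)))  # hand to the next core
        pending = leftover
    return schedule

# Two requests share source A, so they land in different timeslots.
reqs = [("A", "B"), ("A", "C"), ("D", "B")]
print(arbitrate(reqs, num_slots=2))   # {0: 0, 1: 1, 2: 1}
```

Because every transmission gets an explicit slot before it is sent, packets never pile up in switch queues, which is where the near-zero queuing claim comes from.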
(Via Jon Gold, Network World Senior Writer.)
Sniffing out Censorship
Georgia Tech researchers are enlisting website operators to help them better understand which sites are being censored and then, by examining the data collected, figure out how to get around such restricted access.
The open source Encore [Enabling Lightweight Measurements of Censorship with Cross-Origin Requests] tool has website operators install a single line of code on their sites, which in turn allows the researchers to determine whether visitors to those sites are blocked from visiting other sites around the world known to be censored. The researchers hope to enlist a mix of small and big websites; currently the tool is running on about 10 of them.
The code works in the background after a page is loaded, and Georgia Tech's team claims the tool won't slow performance for end users or websites, nor does it track browsing behavior.
"Web censorship is a growing problem affecting users in an increasing number of countries," said Sam Burnett, the Georgia Tech Ph.D. candidate who leads the project, in a statement. "Collecting accurate data about what sites and services are censored will help educate users about its effects and shape future Internet policy discussions surrounding Internet regulation and control."
Keeping an Eye on Excel
University of Massachusetts Amherst researchers have released a tool called CheckCell that's designed to spot errors in Microsoft Excel spreadsheets that could lead to big problems. And as researchers will tell you, most spreadsheets do tend to have errors in them. The researchers say spreadsheet errors can have serious consequences, whether it's messing up a student's grades or leading to erroneous research data becoming accepted as fact.
CheckCell, available as a free Excel software plug-in on GitHub, is a Microsoft Research-funded project.
The researchers' technique for pinpointing Excel errors uses what one member of the team calls "a threshold of unusualness" in which questionable data points are marked for spreadsheet designers to double check. This data debugging approach addresses shortcomings in simple testing and static analysis efforts designed to root out bugs in programs.
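A toy version of flagging values past a "threshold of unusualness" might look like the following. This is an illustrative assumption, not CheckCell's actual analysis, which centers on each cell's impact on the spreadsheet's computations:

```python
# Toy "unusualness" flagging: mark any value in a column that sits more
# than `threshold` standard deviations from the column's mean, so a human
# can double-check it. Illustrative only; not CheckCell's algorithm.
import statistics

def flag_unusual(column, threshold=3.0):
    mean = statistics.mean(column)
    stdev = statistics.pstdev(column)
    if stdev == 0:
        return []          # no spread, nothing stands out
    return [i for i, v in enumerate(column)
            if abs(v - mean) / stdev > threshold]

# A grade of 8500 is likely a typo for 85.00 and gets flagged for review.
grades = [85, 90, 88, 92, 87, 8500]
print(flag_unusual(grades, threshold=2.0))   # [5]
```

The key design point is that the tool doesn't silently "fix" anything; it surfaces suspicious cells so the spreadsheet's author makes the final call.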
Next up is applying the technology to Big Data.
Picture Perfect Computers
University of Illinois at Chicago and University of Hawaii scientists have been awarded $300K in National Science Foundation funding to develop a conversational and interpretive computer that can create easy-to-digest visualizations from data based on natural language requests and common gestures like pointing.
"Today, with big data, you really need to be using visualizations to help you figure out what it is you're looking at," says Andrew Johnson, director of research at UIC's Electronic Visualization Laboratory. "Visualization should be interactive, a dynamic process. We want scientists to be able to get ideas out there quickly."
Among other things, the technology would give scientists a tool beyond basic spreadsheets like Excel to graph data. The funded project is based on earlier visualization work out of UIC dubbed Articulate.