Thursday, January 03, 2019

TECHNOLOGY: Google Wins U.S. Approval for Radar-Based Hand Motion Sensor

Via Reuters:
U.S. regulators have approved Google's deployment of a radar-based motion sensor, granting it a waiver to use the device at higher power levels than currently permitted. The U.S. Federal Communications Commission (FCC) said the Project Soli device "will serve the public interest by providing for innovative device control features using touchless hand gesture technology." According to the FCC, the sensor captures motion in a three-dimensional space using a radar beam to facilitate touchless control of functions or features that can benefit users with mobility or speech impediments. Google said the sensor enables users to press an invisible button between the thumb and index fingers, or a virtual dial that turns by rubbing the thumb against the index finger. Said Google, "Even though these controls are virtual, the interactions feel physical and responsive" as feedback is produced by the haptic sensation of fingers touching.
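
Purely to make the interaction concrete — and not based on Google's actual Soli software, whose APIs are not described in the Reuters item — here is a minimal sketch of how recognized gestures like the virtual button press and dial turn described above might be routed to device controls. Every name in it is hypothetical.

```python
# Hypothetical sketch: routing Soli-style classified gestures to controls.
# Gesture labels, events, and handlers are illustrative, not Google's API.

def on_button_press():
    print("Toggling playback")

def on_dial_turn(delta):
    print(f"Adjusting volume by {delta}")

HANDLERS = {
    "virtual_button_press": lambda event: on_button_press(),
    "virtual_dial_turn": lambda event: on_dial_turn(event.get("delta", 0)),
}

def dispatch(event):
    """Route a classified gesture event to its control callback."""
    handler = HANDLERS.get(event["gesture"])
    if handler:
        handler(event)

# Example: a dial turn produced by rubbing the thumb against the index finger.
dispatch({"gesture": "virtual_dial_turn", "delta": +2})
```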

Thursday, August 11, 2016

INNOVATION: World’s First Parallel Computer Based on Biomolecular Motor

And now, news from Germany.

A new parallel-computing approach can solve combinatorial problems, according to a study published in Proceedings of the National Academy of Sciences. Researchers from the Max Planck Institute of Molecular Cell Biology and Genetics and the Dresden University of Technology collaborated with an international team on the technology. The researchers note that significant advances have been made in conventional electronic computers in the past decades, but their sequential operation prevents them from efficiently solving problems of a combinatorial nature. The number of calculations required to solve such problems grows exponentially with the size of the problem, making them intractable for sequential computing. The new approach addresses these issues by combining well-established nanofabrication technology with molecular motors that are very energy-efficient and inherently work in parallel. The researchers demonstrated the parallel-computing approach on a benchmark combinatorial problem that is very difficult to solve with sequential computers. The team says the approach is scalable, error-tolerant, and dramatically improves the time to solve combinatorial problems of size N. The problem to be solved is "encoded" within a network of nanoscale channels by both mathematically designing a geometrical network that is capable of representing the problem, and by fabricating a physical network based on this design using lithography. The network is then explored in parallel by many protein filaments self-propelled by a molecular layer of motor proteins covering the bottom of the channels.
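
To see why such problems overwhelm a sequential machine, consider brute-force subset sum, a classic combinatorial benchmark (the summary above does not name the exact benchmark used, so the instance below is just an illustration): every added element doubles the number of candidate subsets a sequential computer must examine.

```python
from itertools import combinations

def subset_sum_bruteforce(numbers, target):
    """Exhaustively check every subset: 2**N candidates for N numbers."""
    solutions = []
    for r in range(len(numbers) + 1):
        for combo in combinations(numbers, r):
            if sum(combo) == target:
                solutions.append(combo)
    return solutions

# Toy instance: which subsets of {2, 5, 9} sum to 11?
print(subset_sum_bruteforce([2, 5, 9], 11))   # [(2, 9)]
# The work doubles with every added number: 20 numbers -> ~1e6 subsets,
# 40 numbers -> ~1e12, which is what makes a parallel exploration attractive.
```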

Saturday, July 02, 2016

INNOVATION: Computers read 1.8 billion words of fiction to learn how to anticipate human behaviour

Meanwhile at Stanford:

Researchers at Stanford University are using 600,000 fictional stories to inform their new knowledge base called Augur. The team considers the approach to be an easier, more affordable, and more effective way to train computers to understand and anticipate human behavior. Augur is designed to power vector machines that make predictions about what an individual user might be about to do, or want to do next. The system's current success rate is 71 percent for unsupervised predictions of what a user will do next, and 96 percent for recall, or identification of human events. The researchers report that dramatic stories can introduce comical errors into a machine-based prediction system. "While we tend to think about stories in terms of the dramatic and unusual events that shape their plots, stories are also filled with prosaic information about how we navigate and react to our everyday surroundings," they say. The researchers note artificial intelligence will need to put scenes and objects into an appropriate context. They say crowdsourcing or similar user-feedback systems will likely be needed to amend some of the more dramatic associations certain objects or situations might inspire.
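
Augur itself is not described in detail here, so the following is only a toy sketch of the underlying idea — mining story text for statistics about which everyday activities tend to follow which, then predicting the most likely next action — using an invented mini-corpus and activity vocabulary.

```python
from collections import Counter, defaultdict
import re

# Toy stand-in for the Augur idea: learn "what tends to follow what" from fiction.
# The corpus and the activity vocabulary are invented for illustration only.
corpus = [
    "She poured coffee and opened her laptop before checking email.",
    "He opened his laptop, checked email, then joined a meeting.",
    "After pouring coffee she checked email and read the news.",
]
activities = ["pour coffee", "open laptop", "check email", "join meeting", "read news"]

def extract(sentence):
    """Return the activities mentioned in a sentence, in order of appearance."""
    s = sentence.lower()
    found = []
    for act in activities:
        verb, noun = act.split()
        m = re.search(rf"\b{verb}\w*\b.*\b{noun}\b", s)
        if m:
            found.append((m.start(), act))
    return [a for _, a in sorted(found)]

# Count which activity follows which across the corpus.
follows = defaultdict(Counter)
for sentence in corpus:
    seq = extract(sentence)
    for cur, nxt in zip(seq, seq[1:]):
        follows[cur][nxt] += 1

# Predict the most common next activity after "open laptop".
print(follows["open laptop"].most_common(1))  # [('check email', 2)]
```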

Friday, April 15, 2016

TECHNOLOGY: How Can Supercomputers Survive a Drought?

Via HPCwire.com:

Water scarcity has emerged as a critical issue in the U.S. and around the globe. A McKinsey-led report shows that, by 2030, global water demand is expected to exceed supply by 40%. According to another recent report by the Congressional Research Service (CRS), more than 70% of the land area in the U.S. experienced drought conditions during August 2012.

By 2014, conditions had become even worse in some states: following a three-year dry period, California declared a statewide drought emergency. An NBC News report on the drought quotes California Gov. Jerry Brown calling it "perhaps the worst drought California has ever seen since records began being kept about 100 years ago." Such repeated episodes of extended drought and water scarcity have made concerted approaches to tackling the global crisis and ensuring water sustainability a necessity.

Supercomputers are notorious for consuming a significant amount of electricity, but a less-known fact is that supercomputers are also extremely “thirsty” and consume a huge amount of water to cool down servers through cooling towers that are typically located on the roof of supercomputer facilities. While high-density servers packed in a supercomputer center can save space and/or costs, they also generate a large amount of heat which, if not properly removed, could damage the equipment and result in huge economic losses.

Its high heat capacity makes water an ideal and energy-efficient medium for rejecting server heat into the environment through evaporation, an old yet effective cooling mechanism. According to Amazon’s James Hamilton, a 15 MW data center could guzzle up to 360,000 gallons of water per day. The U.S. National Security Agency’s data center in Utah would require up to 1.7 million gallons of water per day, enough to meet the water needs of more than 10,000 households.

Although water consumption is related to energy consumption, the two differ: because water efficiency varies over time with volatile outside temperatures, the same amount of server energy consumed at different times may result in different amounts of water evaporation in cooling towers. Beyond onsite cooling towers, supercomputers' enormous appetite for electricity also makes them accountable for offsite water consumption embedded in electricity production. In fact, electricity production accounts for the largest water withdrawal of any sector in the U.S. While not all of that withdrawal is consumed or “lost” via evaporation, the national average water consumption for generating just one kWh of electricity still reaches 1.8 L/kWh, even excluding hydropower, which is itself a huge water consumer.
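
As a rough back-of-the-envelope check on these figures (a sketch only, assuming round-the-clock operation of a hypothetical 15 MW facility and the 1.8 L/kWh national-average figure quoted above), the offsite water footprint of a supercomputer's electricity use can be estimated directly:

```python
# Back-of-the-envelope estimate of a supercomputer's offsite water footprint.
# Assumes round-the-clock operation of a hypothetical 15 MW facility and the
# 1.8 L/kWh national-average consumption figure quoted above.
power_mw = 15
hours_per_day = 24
water_per_kwh_litres = 1.8
litres_per_gallon = 3.785

energy_kwh_per_day = power_mw * 1000 * hours_per_day            # 360,000 kWh/day
offsite_water_litres = energy_kwh_per_day * water_per_kwh_litres
offsite_water_gallons = offsite_water_litres / litres_per_gallon

print(f"{energy_kwh_per_day:,.0f} kWh/day -> "
      f"{offsite_water_litres:,.0f} L (~{offsite_water_gallons:,.0f} gal) per day offsite")
# ~648,000 L (~171,000 gal) per day, on top of onsite cooling-tower evaporation.
```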

Friday, April 08, 2016

LANGUAGE: some comments on the proposal

This is a re-post of one of the posts from the Ask The Delphic Oracle blog (Link: http://askthedelphicoracle.blogspot.com). I had written up a proposal for a better orthography for the Tamil language. I just wanted to add some explanatory comments. Here they are.

-+-

Some comments on the proposal:

(1) Very short learning curve: It should take maybe a couple of minutes for someone who knows both Hindi and Tamil to learn the schema.

(2) Obvious interpretation: Even someone who has never seen the system before should find it obvious how to interpret the letters in it.

(3) Technologically compatible with multiple platforms: This system is very easy to type on a standard Tamil keyboard, and just as easy to use with web applications like Quillpad.

(4) Helpful for minorities: It is very helpful for minority groups, for instance in availing of social services such as getting prescriptions filled correctly. (Rather than garbling the name of a medicine, one could write it without loss of fidelity in this system.)

-+-

Wednesday, March 30, 2016

TECHNOLOGY: Four DARPA projects that could be bigger than the Internet

From DefenseOne:

Forty years ago, a group of researchers with military money set out to test the wacky idea of making computers talk to one another in a new way, using digital information packets that could be traded among multiple machines rather than telephonic, point-to-point circuit relays. The project, called ARPANET, went on to fundamentally change life on Earth under its more common name, the Internet.

Today, the agency that bankrolled the Internet is called the Defense Advanced Research Projects Agency, or DARPA, which boasts a rising budget of nearly $3 billion split across 250 programs. They all have national security implications but, like the Internet, much of what DARPA funds can be commercialized, spread and potentially change civilian life in big ways that its originators didn’t conceive.

What’s DARPA working on lately that could be Internet big?

Sunday, March 20, 2016

INNOVATION: Mathematical model to explain how things go viral

Interesting research on virality. At the University of Aberdeen:

A University of Aberdeen-led research team has developed a model that explains how things go viral in social networks, and it includes the impact of friends and acquaintances in the sudden spread of new ideas. "Mathematical models proposed in the past typically neglected the synergistic effects of acquaintances and were unable to explain explosive contagion, but we show that these effects are ultimately responsible for whether something catches on quickly," says University of Aberdeen researcher Francisco Perez-Reche. The model shows people's opposition to accepting a new idea acts as a barrier to large contagion, until the transmission of the phenomenon becomes strong enough to overcome that reluctance. Although social media makes the explosive contagion phenomenon more apparent in everyday life than ever before, it is the intrinsic value of the idea or product, and whether friends and acquaintances adopt it or not, which remains the crucial factor. The model potentially could be used to address social issues, or by companies to give their product an edge over competitors. "Our conclusions rely on numerical simulations and analytical calculations for a variety of contagion models, and we anticipate that the new understanding provided by our study will have important implications in real social scenarios," Perez-Reche says.
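
The Aberdeen model's equations are not given in the summary above, so the following is only a loose illustrative sketch rather than the published model: a toy network simulation in which each additional adopting neighbour adds to the "push" on an individual, and adoption occurs only once that combined push clears a reluctance barrier, after which spread can become explosive.

```python
import random

# Toy illustration only (not the Aberdeen model): contagion in which the push
# to adopt grows synergistically with the number of adopting neighbours, and
# adoption happens only once that push overcomes an individual's reluctance.
def simulate(n=200, k=6, base=0.05, synergy=0.2, reluctance=0.2, steps=50, seed=1):
    rng = random.Random(seed)
    neighbours = {i: set() for i in range(n)}
    for i in range(n):                        # simple random graph, avg degree ~2k
        for j in rng.sample([x for x in range(n) if x != i], k):
            neighbours[i].add(j)
            neighbours[j].add(i)
    start = rng.randrange(n)
    adopted = {start} | neighbours[start]     # a small, tightly knit seed group
    for _ in range(steps):
        new = set()
        for i in set(range(n)) - adopted:
            m = len(neighbours[i] & adopted)  # number of adopting neighbours
            if m == 0:
                continue
            push = base + synergy * (m - 1)   # each extra adopter adds to the push
            if push > reluctance and rng.random() < push:
                new.add(i)
        if not new:
            break
        adopted |= new
    return len(adopted) / n

print(f"with synergy:    {simulate(synergy=0.2):.0%} of the network adopts")
print(f"without synergy: {simulate(synergy=0.0):.0%} of the network adopts")
```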

Wednesday, February 03, 2016

INNOVATION: Memory cells built on paper


Via IEEE.org:
A team based at the National Taiwan University in Taipei has used a combination of inkjet and screen printing to make small resistive RAM memory cells on paper. These are the first paper-based, nonvolatile memory devices, the team says (nonvolatile means that the device saves its data even when it's powered down).  
As Andrew Steckl outlined in his feature for IEEE Spectrum last year, paper has a lot of potential as a flexible material for printed electronics. The material is less expensive than other flexible materials, such as plastic. It boasts natural wicking properties that can be used to draw fluids into sensors. And it can be easily disposed of by shredding or burning.

Tuesday, January 26, 2016

TECHNOLOGY: Scheduling algorithms based on game theory make better use of computational resources

Via Phys.org:

Rubing Duan and Xiaorong Li at the A*STAR Institute of High Performance Computing in Singapore and co-workers have now developed a scheme to address the scheduling problem in two large-scale applications: the ASTRO program from the field of cosmology, which simulates the movements and interactions of galaxy clusters, and the WIEN2k program from the field of theoretical chemistry, which calculates the electronic structure of solids. The researchers' new scheme relies on three game-theory-based scheduling algorithms: one to minimize the execution time; one to reduce the economic cost; and one to limit the storage requirements.

The researchers performed calculations wherein they stopped the competition for resources when the iteration reached the upper limit of optimization. They compared their simulation results with those from related algorithms—namely, Minimum Execution Time, Minimum Completion Time, Opportunistic Load Balancing, Max-min, Min-min and Sufferage. The new approach showed improvements in terms of speed, cost, scheduling results and fairness. Furthermore, the researchers found that the execution time improved as the scale of the experiment increased. In one case, their approach delivered results within 0.3 seconds while other algorithms needed several hours.
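
The article does not spell out the game-theoretic algorithms themselves, so rather than guess at them, here is a minimal sketch of Min-min, one of the well-known baseline heuristics the new scheme is compared against: repeatedly assign the task whose best-case completion time is smallest to the machine that achieves it. The task and machine timings below are made up for illustration.

```python
# Sketch of the Min-min baseline heuristic (one of the algorithms the A*STAR
# scheme is compared against), not the game-theoretic scheduler itself.
# eta[t][m] = estimated execution time of task t on machine m (made-up numbers).
eta = {
    "t1": {"m1": 4, "m2": 7},
    "t2": {"m1": 3, "m2": 2},
    "t3": {"m1": 8, "m2": 5},
    "t4": {"m1": 6, "m2": 9},
}

def min_min(eta):
    ready = dict(eta)                             # unscheduled tasks
    machine_ready = {m: 0 for m in next(iter(eta.values()))}
    schedule = []
    while ready:
        # Completion time = machine's current ready time + task's execution time.
        task, machine, finish = min(
            ((t, m, machine_ready[m] + cost)
             for t, costs in ready.items() for m, cost in costs.items()),
            key=lambda x: x[2],
        )
        schedule.append((task, machine, finish))  # smallest minimum completion time
        machine_ready[machine] = finish
        del ready[task]
    return schedule, max(machine_ready.values())

plan, makespan = min_min(eta)
for task, machine, finish in plan:
    print(f"{task} -> {machine} (finishes at {finish})")
print("makespan:", makespan)
```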

Sunday, November 29, 2015

INNOVATION: Is There a Crisis in Computer-Science Education?

From the Chronicle of Higher Education:

Furthermore, to focus only on computer-science majors misses a larger point. As Ms. Raja argues in her essay, simply teaching kids how to code shouldn’t be the only goal. Just as important—or perhaps more so—is teaching kids how to think like a computer programmer—what is called “computational thinking.” She highlights some current efforts to teach computational thinking in elementary and secondary schools, particularly to girls and members of minority groups, who remain woefully underrepresented among computer-science degree-holders and professional computer programmers.

And while teaching computational thinking may result in more computer-science degrees, the more important contribution it will make is giving more people across all fields the ability to solve problems like a computer scientist and to speak the language of computer programming.

As Ms. Raja notes, those are skills everyone should have access to, regardless of their major.