Microsoft Uses Technology to Tackle Scientific Challenges
Microsoft on Monday launched an expanded push into technical computing that it says is needed to solve ever more complex scientific challenges.
"Recent world events clearly demonstrated our inability to process vast amounts of information and variables that would have helped to more accurately predict the behavior of global financial markets or the occurrence and impact of a volcano eruption in Iceland," Bob Muglia, president of Microsoft's Server and Tools unit, said in a statement.
The software maker said a new team will focus on a number of key technical computing challenges such as shifting high-end computing to the cloud, making it easier to write parallel code, and developing the new tools and software needed for data-intensive modeling tasks.
Beyond solving the world's problems, the new unit is also trying to create a new market for Windows Azure--Microsoft's cloud operating system.
"We really believe technical computing is going to be the killer app for the cloud," Microsoft general manager Bill Hilf said in a telephone interview. Of the requisite high-end computers, Hilf said, "they gobble up compute power. They need huge amounts of data."
The effort, which has been quietly coming together over the past 18 months, includes a team of about 500 dedicated staff along with several hundred more from other product teams at the company. The unit will be jointly run by two Microsoft general managers--Hilf and Kyril Faenov--and will be responsible for the high-performance computing version of Windows as well as the new efforts.
For several years now, Microsoft has had a cluster-computing version of Windows Server known these days as Windows HPC Server.
The deeper push into high-end computing was announced in a post on Microsoft's Web site. One of the key efforts will be to develop new kinds of software for scientists, engineers and data analysts. "Our development efforts will yield new, easy-to-use tools and applications that automate data acquisition, modeling, simulation, visualization, workflow and collaboration," Muglia said. "This will allow them to spend more time on their work and less time wrestling with complicated technology."
Another key challenge across the computer industry, but particularly at the high end, has been shifting the way software is written so that tasks can be carried out in parallel by multiple PCs, servers, and processor cores simultaneously.
"Today, computers are shipping with more processing power than ever, including multiple cores, but most modern software only uses a small amount of the available processing power," Muglia said. "Parallel programs are extremely difficult to write, test and troubleshoot."
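The gap Muglia describes--hardware with many cores but software that uses only one--can be illustrated with a generic sketch. The following Python example (purely illustrative, not any Microsoft tool mentioned here) uses the standard-library concurrent.futures module to split a CPU-bound job across worker processes:

```python
# Illustrative sketch of parallelizing independent, CPU-bound work
# across cores; generic Python, unrelated to Microsoft's tooling.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division -- deliberately CPU-bound."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

def parallel_prime_count(limit, workers=4):
    # Split the range into roughly equal chunks, one per worker process.
    step = limit // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], limit)  # cover any remainder
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    print(parallel_prime_count(100_000))
```

A serial loop gives the same answer; the parallel version only pays off on multi-core hardware when the work is large enough to outweigh the cost of starting processes--exactly the kind of trade-off that makes parallel code hard to write, test and tune.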
The new effort also aims to see how much high-end work can be shifted to cloud-based set-ups such as Windows Azure. "Existing high-performance computing users will benefit from the ability to augment their on-premises systems with cloud resources that enable 'just-in-time' processing," Muglia said.
Over the last 18 months, the new technical computing unit has worked on both HPC server and adding parallel computing capabilities to Visual Studio. Going forward, the group is looking more broadly. In the coming fiscal year, Hilf said, the company will have demos and beta versions of products that allow developers to write models that can be run in various setups--on a desktop, on a cluster, or in the cloud.
Hilf said that Microsoft is looking at providing both developer tools and the broad, horizontal tools that scientists across industries will need.
"We won't be, of course, doing vertical-specific software," Hilf said, noting that Microsoft might do a version of Excel that works with Azure, or build an add-on or even an all-new program, but it won't be doing something as specific as, say, a program for the oil and gas industry. "The Schlumbergers of the world will continue to be great partners. Our role with them will be to provide a platform for them."