Data Informed



Vint Cerf Envisions Interoperable Clouds Driving Collaborative Computing, Analytics

January 17, 2013





GAITHERSBURG, Md. – Vint Cerf, celebrated “father of the Internet” now working at Google as vice president and chief Internet evangelist, sees parallels between pre-Internet networking and the state of cloud computing today.

In a lively keynote speech January 16 to more than 600 researchers from government, academia and industry at the National Institute of Standards and Technology’s Cloud Computing and Big Data Workshop here, Cerf said that today users interact with cloud computing in a client-server model – information flows between individual users (clients) and the data center (server).  The next step in cloud evolution may be clouds interacting directly with each other.  He believes that distinct types of specialized clouds (such as for data analysis or large-scale computing) are developing, and there will be a need for a way to transfer data among them without having to download it from one cloud and upload it onto another.

It’s similar to the proprietary networks of 30 years ago, which allowed communication only among computers of the same brand. Then Cerf and Robert Kahn designed TCP/IP, a non-proprietary protocol that tied those networks together and led to the Internet.

“I have the sense that inter-cloud communication is in the same state of infancy that Internet was in the ’70s,” Cerf said. “We need something like [TCP/IP] in the cloud environment.”
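
To make the gap concrete: moving a dataset between two providers today usually means pulling it down through a client and pushing it back up, because there is no common protocol that lets the clouds hand the data off directly. The Python sketch below is purely illustrative; the client classes and method names are hypothetical stand-ins, not real cloud SDK calls.

# Illustrative only: CloudAClient, CloudBClient, and copy_intercloud are
# hypothetical names, not real cloud SDK APIs.

class CloudAClient:
    def download(self, name: str) -> bytes:
        """Pull the object's bytes down to the caller (placeholder)."""
        return b"..."

class CloudBClient:
    def upload(self, name: str, data: bytes) -> None:
        """Push the bytes back up to the second cloud (placeholder)."""

def copy_via_client(src: CloudAClient, dst: CloudBClient, name: str) -> None:
    # Today's pattern: the data makes a round trip through the client,
    # consuming egress bandwidth from one cloud and the client's uplink to the other.
    data = src.download(name)
    dst.upload(name, data)

def copy_intercloud(src: CloudAClient, dst: CloudBClient, name: str) -> None:
    # The future Cerf describes: a common inter-cloud protocol (a "TCP/IP for
    # clouds") would let the providers move the object server to server.
    raise NotImplementedError("no standard inter-cloud transfer protocol yet")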


Inter-cloud interoperability was just one of a litany of ideas Cerf suggested researchers focus on during the workshop. Others included:

Security of data in the cloud. Cerf noted that data security in the cloud continues to be a concern. Researchers should investigate new methods of authentication, including different types of authentication for different purposes.

The resiliency of cloud computing networks. He asked the researchers to consider loosely coupled systems, rather than rigid structures, when building clouds and networks. Just as buildings are designed to bend and sway to survive earthquakes, loosely coupled systems are more flexible and thus less likely to break. In fact, that’s another way cloud computing should follow the Internet’s example. He suggested that “having hundreds of thousands of cloud systems operating independently of each other – sharing some information for connectivity, but not sharing what’s going on inside each network” may not be optimally efficient, but it will be resilient. “It’s remarkable how resilient the Internet has been now for 30 years.”

Accelerating data transmission speeds. Clouds need higher transmission speeds, both at the backbone level and within the data center itself. Cerf said Google has replaced the routers in its data centers with new machines based on an open source technology called OpenFlow. The routers are operating at rates of 100 gigabits per second and up, he said. “At some point we will require terabit per second speeds for these processing systems to communicate with each other, along with a comparable increase in speeds of the backbone that links the data centers together,” he added.
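
To put those rates in perspective, a rough back-of-the-envelope calculation (my figures, not Cerf’s) shows why terabit links matter at data-center scale; the snippet below assumes a petabyte payload and ignores protocol overhead.

# Back-of-the-envelope transfer times; the petabyte payload is an assumed
# illustration, not a figure from the talk.
PETABYTE_BITS = 8 * 10**15  # 1 PB expressed in bits (decimal units)

def transfer_hours(size_bits: float, rate_bits_per_sec: float) -> float:
    """Idealized transfer time, ignoring protocol overhead and congestion."""
    return size_bits / rate_bits_per_sec / 3600

print(transfer_hours(PETABYTE_BITS, 100e9))  # ~22.2 hours at 100 Gb/s
print(transfer_hours(PETABYTE_BITS, 1e12))   # ~2.2 hours at 1 Tb/s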

Experimenting with large-scale collaborations. Cloud computing is a shared platform that opens up new vistas for collaboration, allowing multiple parties to simultaneously access the same work in an application in real time. Cerf offered the example of multiple people editing a document. Rather than the confusing and laborious process of emailing various versions of the document to several people, the cloud enables all of them to see, discuss and edit the document at the same time. Think of what this might mean for researchers all viewing the same visualization of a large-scale database, he said.

The future of large-scale computing and analytics. Cerf suggested that Bayesian analysis might be the next wave in cloud computing. He participated in a symposium with Judea Pearl, a renowned computer scientist and winner of the 2011 Turing Award, who developed the concept of networks based on Bayesian analysis, a model that represents a set of random variables and their conditional dependencies in graphical form. As he watched Pearl drawing diagrams to explain the causal calculus associated with such networks, Cerf had an epiphany. “I got very excited about the ability of taking very large amounts of data and trying to understand it in terms of Bayesian analysis,” he said. “This suggests to me that we have just barely explored some of the functional capabilities of the large scale computing and memory capability that we have invented so far.”
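
For readers unfamiliar with the model Pearl pioneered, a Bayesian network factors a joint distribution into per-variable conditional probabilities along a directed graph, so each variable depends only on its parents. The toy rain/sprinkler/wet-grass network below is a standard textbook illustration with made-up probabilities, not an example Cerf used.

# Toy Bayesian network: Rain -> WetGrass <- Sprinkler, with Rain and Sprinkler
# independent. The joint factors as P(R, S, W) = P(R) * P(S) * P(W | R, S).
# All probabilities below are invented for illustration.

P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.1, False: 0.9}
P_wet_given = {  # P(WetGrass=True | Rain, Sprinkler)
    (True, True): 0.99,
    (True, False): 0.90,
    (False, True): 0.80,
    (False, False): 0.05,
}

def joint(rain: bool, sprinkler: bool, wet: bool) -> float:
    """Joint probability assembled from the network's conditional tables."""
    p_wet = P_wet_given[(rain, sprinkler)]
    return P_rain[rain] * P_sprinkler[sprinkler] * (p_wet if wet else 1 - p_wet)

# Exact inference on a tiny graph: P(Rain=True | WetGrass=True), summing out Sprinkler.
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
print(num / den)  # ~0.65: observing wet grass raises the probability of rain from the 0.2 prior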

Cerf encouraged the researchers to push the boundaries of cloud computing, and especially to learn from what doesn’t work. He noted that Google collects massive amounts of data on its operations, including every failure. “We have constant measurement of how many error messages are being sent back out to the users, what kinds of the failures the software is experiencing, what processes are dying unexpectedly … and then that all gets analyzed.” In research and development, he said, there “cannot be any better teacher” than failure.

Tam Harbert is a freelance writer based in Washington, D.C. She can be contacted through her website.


