Trends in cloud computing

by Sebastien Mirolo on Mon, 24 Dec 2012

Many companies have been started with nothing more than a virtual server in the cloud. It is hugely convenient because you can be up and running in five minutes: there is no need to buy a physical machine, rent space in a data center, cable the machine into a rack, and so on.

Most web startups track a users-per-server metric. When the load on their infrastructure grows too high, they spawn a new virtual server. In this model, the economics of rental versus ownership are still debatable, even for a small business.
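
As a rough illustration of that scaling rule, here is a minimal Python sketch; the 500 users-per-server target and the numbers in the example are made up for illustration, not figures from any real deployment.

    import math

    # Minimal sketch of a users-per-server scaling rule. The threshold and
    # the example numbers are hypothetical, not any provider's figures.
    TARGET_USERS_PER_SERVER = 500

    def servers_needed(active_users, current_servers):
        """Return how many extra virtual servers to spawn, if any."""
        if active_users / max(current_servers, 1) <= TARGET_USERS_PER_SERVER:
            return 0
        # Round up so the ratio comes back under the target.
        return int(math.ceil(float(active_users) / TARGET_USERS_PER_SERVER)) - current_servers

    if __name__ == '__main__':
        # 2600 active users on 4 servers is 650 users/server,
        # so two more servers bring the ratio back under 500.
        print(servers_needed(active_users=2600, current_servers=4))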

Technical science startups have fundamentally different economics. They deal with compute-intensive problems. For example, when you are paying millions of dollars to manufacture a new generation of processor (the heart of every computer), you want to make sure the design is flawless before it hits the factory. That means running lots of Verilog simulations on lots of machines.

You do not start to solve compute-intensive problems without at least a dozen machines. Technical science requires a cloud, private or public. Thus the question for technical science startups is "How many machines can I dedicate to a single user at any time?" The higher the better. Before the advent of cloud computing, this meant that if you wanted to start a business around compute-intensive problems, you had to invest a huge amount of cash up-front to buy the hardware and set up the network, then spend significantly more to maintain and manage the IT infrastructure. Obviously this was not something that could be started over a weekend in a garage.
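
To make the machines-per-user question concrete, here is a hedged Python sketch that fans one user's simulation testcases out over a pool of rented machines; run_simulation, the node names, and the testcase names are illustrative placeholders, not any real provider's or simulator's API.

    from concurrent.futures import ThreadPoolExecutor

    def run_simulation(machine, testcase):
        # Placeholder: a real version would launch the simulator on the
        # remote machine and collect its results.
        return "%s ran on %s" % (testcase, machine)

    def fan_out(machines, testcases):
        """Spread a single user's testcases over every rented machine."""
        with ThreadPoolExecutor(max_workers=len(machines)) as pool:
            futures = [pool.submit(run_simulation, machines[i % len(machines)], case)
                       for i, case in enumerate(testcases)]
            return [future.result() for future in futures]

    if __name__ == '__main__':
        machines = ['node-%02d' % i for i in range(12)]      # a dozen rented machines
        testcases = ['regress-%03d' % i for i in range(48)]  # one user's regression suite
        for line in fan_out(machines, testcases):
            print(line)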

In the wake of big public cloud providers (e.g. Amazon, Rackspace), we have seen the development of application platform providers (e.g. Google App Engine, Heroku, Red Hat OpenShift). Today we see the emergence of application cluster services, like us, that will enable tinkerers to access tools and compute power that were beyond their reach until now.

Fortylines focuses on Verilog simulations for microprocessor debugging. Others have also found ways to leverage the public cloud to build a business around compute-intensive problems. For example, Rescale is about fluid dynamics simulations (for the aerospace, automotive, oil & gas, and life sciences industries); Upverter is about designing electronic circuit boards.

Patents

I attended a Cloud Governance | Privacy Compliance meetup at the beginning of December. There were interesting presentations and a lively debate focused on the Stored Communications Act and the third-party doctrine.

What was most interesting to me was Jonathan Blavin arguing that the public cloud decreases the need to patent; instead, cloud services will rely on trade secrets for their infrastructure and emphasize copyright protection of their web/mobile interfaces. The argument goes that in a fast-moving technology such as cloud computing, filing patents requires disclosure, which can help competitors leapfrog the patent applicant. At the same time, a cloud service is a black box that cannot be taken apart. The only things a competitor can reasonably access are the user interface and the API (Application Programming Interface) description. Hence the emphasis on copyright.
