Clearly Google is successfully growing its reach and has ambitions to dominate far more markets. With growth comes responsibility. If their vision becomes reality they will surely attract far more attention from the forces of government regulation.
In the early days of Microsoft the company seemed to appeal to those who wanted an alternative to IBM, just as Google appealed to some of those who wanted an alternative to Microsoft. I hear from many quarters in the technical community at large that Google is losing some mind share and goodwill as it grows into more markets.
Perhaps the question should be "can a technology company continue to be perceived as cool as it ceases to be an underdog?"
Anyone who has studied or experienced the architectural trends of computing will be familiar with the following:
- Let's provide remote access to central computing resources as they are expensive - hence the rise of the data centre and mainframe/mid/mini-computers
- Ah, computing resources have become significantly less expensive so let's distribute computing resources and eventually push them out to the users themselves
- Hold on, how on Earth can we manage the vast number of autonomous islands of computing? - hence the rise of Microsoft's Active Directory
- Telecommunications have become significantly less expensive and bandwidth has increased at a comparable rate - let's centralise everything once again!
As I mentioned in my last post, cloud-centric computing could only be used in isolation if the underlying mobile communications infrastructure reached every location from which users could wish to access it, AND if the cost and speed of access were acceptable compared to the alternative of a hybrid of software AND services.
As a child I used to love watching a British TV programme (on the BBC) named "Tomorrow's World" as it showed research technologies in a "this is how life could be" context. It was easy to think that technology would change overnight - one day we'd wake up and things like GPS navigation systems would be everywhere, for example. In reality such capabilities and devices were picked up by early adopters, and over quite a long period of time they became integrated into commodity devices.
I bought my first GPS device in the year 2000 and selected what I considered to be the best device of its kind. Looking back it was a terrible experience for the following reasons:
- the device was expensive (£500)
- it didn't have enough CPU "horsepower" to be able to update the map display fast enough to show a useful level of detail
- it didn't zoom in and out to show just the right level of detail for the task in hand
- it couldn't route me - I had to define the route by hand on my PC before travelling
- it didn't have enough storage to hold map detail for more than roughly 20% of the UK
- additional memory was horribly expensive
Having said all of the above, I loved the device because it:
- was really cool - no-one else I knew had one
- was better than trying to read a map while driving (or stopping to do so), provided I put the effort in to set it up before travelling
- provided a window onto what would be possible as the technology matured
Of course there are some technologies that do take the market by storm and reach critical mass very quickly. It's going to be interesting to see how computing platforms adapt in the next year to exploit the benefits of "cloud computing" while dealing with the problems of worldwide mobile communications platforms that are not yet as reliable, inexpensive, or fast as we'd like.
Don't count Microsoft out as a significant player in the world of "cloud computing" - there's a great deal of amazing technology to come...