Personal reflections of an Internet pioneer

As I graduated from Clare in July 1972, something momentous was happening in a remote production plant in Santa Clara, California. This event was about to transform the world in a way seen only twice before: during the European Renaissance and the Industrial Revolution.

1972 was the year that the first eight-bit microprocessor (the 8008) was manufactured by a start-up company called Intel. This might well be remembered as the birth date of today’s Digital World. A fellow student at Clare, Dr Eni Njoku (1969), and I talked enthusiastically at the time about the prospects of a connected world and how silicon technology could help transform life as we knew it.

My first job was with the research arm of a UK telecommunications company, Plessey. There, I was to design a device that could transmit data at high speed through the public telephone network. By doing so, millions of people could interact through terminals with remote computing resources, interchanging files and emails.

However, in 1974, the UK still offered limited opportunities for technical engineers. I jumped at the chance to take up a graduate place at MIT, a hothouse of the technological revolution. The fees were a pressing worry, but that changed when MIT was awarded a large grant by ARPA, the Advanced Research Projects Agency, to develop a network that would be infinitely expandable and indestructible.

I can remember the moment when my professor, Robert Gallager, offered me a Research Fellowship to help architect the proposed ARPAnet. He explained that a new set of protocols would be needed to encode data so that information could find its way efficiently to its destination. Our task was to examine the different possibilities and select the most efficient and scalable solution – what has become the Internet Protocol (IP) of the modern Internet.

Professor Gallager was himself a remarkable person and a true visionary. A PhD student of Claude E. Shannon, the inventor of Information Theory, he was well equipped to tackle such a challenge. We finally opted for packet switching as the core architecture for the new network. In 1977, our joint work was published in the IEEE Transactions. Vint Cerf, working at Stanford University, and Robert Kahn reached similar conclusions.

Subsequent developments of the ARPAnet are well documented. In 1989, whilst working at CERN in Switzerland, Sir Tim Berners-Lee invented the World Wide Web (WWW). Combined, the ARPAnet and the WWW have enabled hyper-connectivity between some three billion people and thirty billion machines. Few could foresee the impact of such an invention, fuelled by the silicon chip, in bringing revolutionary change across the political, economic and social landscapes of both East and West. As the futurist Alvin Toffler predicted in his seminal book, Future Shock, “we were about to experience change in the rate of change”.

Returning to the UK in the late 70s, I occupied myself for several years helping organisations such as BP, Henkel, Philips and Shell to harness the power of new technologies, many associated with the Internet. That was until, in 1993, the Stanford Research Institute (SRI) – the same organisation that had invented the computer mouse and helped develop Tide detergent – came calling. A once-in-a-lifetime opportunity presented itself to become their global head of operations, with the aim of putting the institute back on a profitable track.

Our first task was to launch a global study, Business in the Third Millennium, which would help corporate entities navigate the newly connected world. Companies from Asia, Europe and the USA contributed millions of dollars to support a five-year research programme. Our conclusions were uncompromising. We forecast the break-up of large monolithic structures into constituent ‘atoms’, first conjectured by Nobel Laureate Ronald Coase in his famous paper ‘The Nature of the Firm’. We hypothesised that ‘hyper-connectivity’, enabled by the Internet, would be superseded by ‘hyper-personalisation’ and that new digital giants would disrupt the comfortable equilibrium of established sector leaders in areas such as finance, media and retail.

At the same time, we leveraged SRI’s technologies to form innovative companies, such as Nuance Communications, which applied natural language speech recognition to everyday products such as cars, PCs and mobile phones – technology that later underpinned Apple’s Siri.

In the late nineties, with SRI back in the black, I moved on to the next challenge. EY recruited me to lead its e-commerce practice amidst the dotcom boom. I worked with companies across the globe to envision new digital structures and ways of working, the most ambitious of which was a collaborative programme of twelve global leaders, including P&G, AMEX and GE, to build shared solutions to common functional needs such as procurement, CRM, HR and finance. We were ahead of our time: cloud-based companies such as Salesforce now emulate those ideas.

And what of the future? My personal view is that the digital revolution is still in its infancy and will continue to fundamentally transform the political, social and economic landscapes over the coming fifty or so years. During this period, we can predict with some certainty that machine intelligence will overtake that of humankind. It will be the Clare millennials and their children rather than the class of 1972 who will be challenged to harness such universal powers.

As Yuval Harari says in his seminal book, Homo Deus, “death is a technical problem that will shortly be solved”.