
Computing Power Is Outstripping Our Knowledge

By Joe McKendrick

June 15, 2017

We may have made a lot of advances in technology in recent years, but most businesses still can’t see what’s on the road right in front of them. If anything, “businesses are driving a car using only a rear-view mirror, with an opaque wall instead of a windshield in front.” That’s the observation of Dr. Vishal Sikka, CEO of Infosys. I had the opportunity to sit down with Sikka at his company’s recent Confluence conference in San Francisco.


Supercomputing power is available to all, but are we ready?

Moore’s Law – which posits that processing power will continue to grow geometrically – apparently has some time left and may go on for decades, Sikka says. However, organizations are focusing too much on speed and capacity, and not enough on the knowledge aspect of all the compute power coming their way. “In corporate planning, organizations have a tough time building out their financial plans,” he remarks. “Their ability to plan and simulate is so incredibly bad. At best, they’re looking at data that’s more than a week old.”

This inability to grasp the tools and platforms available for predictive analytics opens up opportunities for disruptors, regardless of industry, Sikka continues. “They can foresee where a trend is going to go much better than big companies can. Corporations’ ability to foresee things, especially exponential changes, is still quite weak.”

Can technology solve this problem, and bring about more forward-looking organizations? Only partially, Sikka says. “Some applications can help, but ultimately people have to become more educated. If you are in an industry that manufactures chairs and tables, you have to figure out how computing is going to transform this. How long does it take the manufacturer to figure out whether customers like the back support? They need real-time feedback about what’s going on with the user.”

In every company in every industry “there is need for much more data, much more AI – both for simplifying operations, and for figuring out where they need to go.”

While Sikka observes that he has not seen many larger companies transition to disruptive roles, he says size and heritage shouldn’t matter, if digital disruption is done the right way.

Cloud computing is an important avenue to achieving this, he states. “Cloud computing is already here on a massive scale. It’s enabling a lot of this very elastic, very cheap, very massive-scale computing.” Again, a company doesn’t have to be a startup to take advantage of the disruptive effects of cloud. Sikka notes that his company has about 9,500 projects underway, all facilitated by cloud, which “makes it possible to do innovation and adopt new processes – not only DevOps and Agile and so forth, but also hackathons and reward mechanisms based on shared work.”

While technology is ubiquitous, Sikka notes that, contrary to perceptions, we may still be in the “Dark Ages” of computing literacy. In the original Dark Ages of medieval times, the literacy rate was about 6%. Currently, he observes, the share of people who can program a computer is about half a percent.

The question is, of course, how essential are programming skills in an era when applications and interfaces are getting relatively simple to use via platforms such as cloud and social media? The emphasis on learning Java or C++ is missing the point, Sikka says. What’s important is building and designing services and capabilities around the way people think. “The computer is a machine’s machine – it has the ability to simulate any other machine,” he says.

If anything, people have the wrong idea of what a “computer” really is. “The iPhone is a computer in the shape of a phone,” he illustrates. A Tesla is “a computer in the shape of a car. Nest is a computer in the shape of a thermostat. It is not simply a thermostat that has some software running inside of it.”

A computer is more than a keyboard, screen, or mouse, Sikka continues. “Being able to program a computer is like thinking about thinking. When people learn to write programs, they learn algorithmic thinking. They imagine what is in the head of somebody who thinks about this problem, and what that person is going to do. That is what computing is really about.”

(Disclosure: Infosys, mentioned in this article, assisted with my travel expenses to Confluence.)


This article was written by Joe McKendrick from Forbes and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to