Artificial intelligence (AI): 5 trends, hype-tested – The Enterprisers Project

If you are considering using artificial intelligence (AI) to mature your foundational IT and data capabilities, how do you separate hype from reality?

Whether you are exploring the promises of AI for your business or still wondering when you will see truly transformative results, here are five industry trends that will help realize AI’s untapped potential. Let’s break them down:

1. Black-box AI vs. explainable AI

For most of us, deep learning systems are essentially incomprehensible. Trained on millions of data points as inputs and their corresponding outputs, their internal logic generally cannot be interpreted in plain language.

[ Get our quick-scan primer on 10 key artificial intelligence terms for IT and business leaders: Cheat sheet: AI glossary. ]

However, if automated systems are to assist in making critical decisions, such as which operations and processes to use, and we cannot understand how those decisions are made, how can we identify and address errors? This lack of common sense (a challenge first raised in the context of AI by John McCarthy in the 1950s) has limited the real-world application of AI to date. We need clearer, less complicated AI systems that better relate to the world and to people.
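One reason explainability is tractable for simple models is that their predictions decompose into readable parts. The sketch below (hypothetical feature names and weights, not from any real system) shows how a linear model's score can be broken into per-feature contributions, the kind of plain-language "why" that a deep network's entangled weights do not offer:

```python
# Minimal sketch: a linear model is interpretable because each feature's
# contribution to a prediction can be read off directly.

def explain_linear_prediction(weights, bias, features):
    """Return a linear model's score and its per-feature contributions."""
    contributions = {name: weights[name] * value
                     for name, value in features.items()}
    score = bias + sum(contributions.values())
    return score, contributions

# Hypothetical loan-approval model and applicant.
weights = {"income": 0.4, "debt": -0.7, "years_employed": 0.2}
bias = 0.1
applicant = {"income": 5.0, "debt": 3.0, "years_employed": 4.0}

score, contribs = explain_linear_prediction(weights, bias, applicant)
# Each term answers "why": here, debt lowered the score by about 2.1 points.
```

A deep network offers no such decomposition, which is why the field has turned to post-hoc explanation methods instead.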

2. Machine learning vs. machine teaching

According to the McKinsey Global Institute, by 2030 work hours spent on physical and manual skills and on basic cognitive skills are expected to decrease by 14 percent and 15 percent, respectively. We will instead spend more time using higher cognitive skills, such as answering “why” and deciding what to do.

This new way of working will create demand for tools to support it. PARC scientist Mark Stefik’s research on mechagogy (machine teaching) describes a future in which people and machines learn from each other’s strengths. We can imagine AI systems becoming an essential part of the workplace: our “thinking” partners.
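Mechagogy has no standard API, but one well-established building block for this kind of two-way exchange is uncertainty sampling from active learning: the machine flags the case it is least sure about and asks the human to teach it. A minimal sketch, with entirely hypothetical data:

```python
# Minimal sketch of one human-machine "teaching" exchange via uncertainty
# sampling: the machine queries the human about its least confident case.

def most_uncertain(predictions):
    """Pick the item whose predicted probability is closest to 0.5."""
    return min(predictions, key=lambda item: abs(predictions[item] - 0.5))

# Hypothetical fraud scores from a model: near 0 or 1 means confident.
predictions = {"invoice_001": 0.97, "invoice_002": 0.52, "invoice_003": 0.08}

query = most_uncertain(predictions)        # -> "invoice_002"
human_label = "fraud"                      # the human answers the query
predictions[query] = 1.0 if human_label == "fraud" else 0.0
```

In a real system the human's answer would retrain the model rather than overwrite one score, but the division of labor is the same: the machine supplies scale, the human supplies judgment on the hard cases.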

3. von Neumann computing vs. neuromorphic computing

One of the key disruptions in IT during the next decade will be the transition from traditional von Neumann computing architectures to neuromorphic computing. As Moore’s law slows and we encounter the von Neumann bottleneck, what can we learn from the most efficient computer to date — the brain?

Biological brains have memory and compute in the same circuits, whereas traditional von Neumann digital computers separate memory from compute. Biological brains are highly parallel, whereas digital computers perform computations serially. Biological brains are dense and require only a minuscule fraction of the energy a digital computer uses. These architectural differences, above all the memory-compute separation known as the von Neumann bottleneck, are the primary reason modern digital computers struggle to run huge AI programs.
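The basic unit many neuromorphic chips implement is the spiking neuron, where state (membrane voltage) and computation live in the same place and work happens only when spikes arrive. A minimal leaky integrate-and-fire sketch, with illustrative parameters not tuned to any real hardware:

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) neuron: it integrates
# input current into a voltage that leaks over time, and emits a spike
# (then resets) whenever the voltage crosses a threshold.

def lif_neuron(input_current, leak=0.9, threshold=1.0):
    """Return the spike train produced by a stream of input currents."""
    voltage, spikes = 0.0, []
    for current in input_current:
        voltage = voltage * leak + current   # integrate with leak
        if voltage >= threshold:             # fire and reset
            spikes.append(1)
            voltage = 0.0
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.4, 0.4, 0.4, 0.0, 0.9, 0.9]))  # -> [0, 0, 1, 0, 0, 1]
```

Because nothing happens between spikes, hardware built around this model can stay idle most of the time, which is where the brain-like energy savings come from.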

4. Digital vs. quantum computing

Scaling limitations prevent conventional digital computers from meeting the demands of AI computing. Quantum computers use qubits and quantum parallelism to handle much larger state spaces and to explore many candidate solutions simultaneously. Incumbents like IBM and Google AI Quantum and startups like Bleximo are working to combine general-purpose processors with noisy intermediate-scale quantum (NISQ) application-specific coprocessors, called quantum accelerators, to build systems for specific business and engineering domains. Early potential industry applications include chemistry (for materials), pharmaceuticals (for drug design), and finance (for optimization).
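What "qubits and parallelism" means concretely is that a qubit can hold a superposition of states. The toy sketch below simulates a single qubit classically as a two-entry state vector and applies a Hadamard gate, after which a measurement yields 0 or 1 with equal probability (note that a real quantum computer does not reveal all amplitudes; a measurement returns only one outcome):

```python
# Minimal sketch: simulate one qubit as a 2-entry state vector and apply
# a Hadamard gate, producing an equal superposition of |0> and |1>.
import math

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state (a, b)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

state = (1.0, 0.0)          # the qubit starts in |0>
state = hadamard(state)     # equal superposition of |0> and |1>
probs = [abs(amp) ** 2 for amp in state]
# probs is approximately [0.5, 0.5]: each outcome is equally likely.
```

Simulating n qubits this way needs a state vector of 2**n amplitudes, which is exactly why classical machines cannot keep up and quantum hardware becomes attractive.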

[ Read also: Quantum computing: 4 things CIOs should know. ]

5. Electronic vs. brain-machine interface devices

Current AI applications primarily run on electronics, but we’ll eventually see a more intimate integration of electronic and biological systems. For example, Neuralink, one of Elon Musk’s latest ventures, announced plans to start clinical trials of its implantable brain-machine interface (BMI) devices with humans by the end of 2020. With the integration of AI applications and our biological systems, the boundary between humans and machines has begun to blur. Scientists are also combining BMIs and AI to control external devices using brain signals and to recreate aspects of cortical function with AI systems.

Most scientists and technologists agree that we have only scratched the surface of AI’s potential. CIOs and their organizations increasingly need to keep track of the latest developments in this transformative technology.

[ How can automation free up more staff time for innovation? Get the free eBook: Managing IT with Automation. ] 
