Cognitive Computing and Artificial Intelligence

50th in a series of 50 Knowledge Management Components (Slide 64 in KM 102)

Definitions

  • Cognitive computing: simulation of human thought processes in a computerized model, involving self-learning systems that use data mining, pattern recognition and natural language processing to mimic the way the human brain works; makes a new class of problems computable, addressing complex situations that are characterized by ambiguity and uncertainty

  • Artificial intelligence: the capacity of a computer to perform operations analogous to learning and decision making in humans, as by an expert system

  • Expert systems: computerized systems that emulate the decision-making ability of a human expert; designed to solve complex problems by reasoning about knowledge, represented mainly as if–then rules rather than through conventional procedural code

  • Natural language processing: a branch of artificial intelligence that deals with analyzing, understanding and generating the languages that humans use naturally in order to interface with computers in both written and spoken contexts using natural human languages instead of computer languages

  • Machine learning: giving computers the ability to learn without being explicitly programmed; a method of data analysis that automates analytical model building; using algorithms that iteratively learn from data to find hidden insights without being explicitly programmed where to look

  • Deep learning: part of a broader family of machine learning methods based on learning data representations, as opposed to task-specific algorithms

  • Neural networks: computing systems made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs; artificial neural networks (ANNs) are algorithms or actual hardware that are loosely modeled after the neuronal structure of the mammalian cerebral cortex but on much smaller scales

  • Intelligent agents: algorithms that interpret requests and provide responses for specific domains and tasks, and produce unique results to questions that have not been pre-programmed

  • Chatbot: a computer program that conducts a conversation via auditory or textual methods; often designed to simulate convincingly how a human would behave as a conversational partner, sometimes with the goal of passing the Turing test

  • Generative AI (GenAI): artificial intelligence capable of generating text, images, synthetic data, or other media, using generative models. Generative AI models learn the patterns and structure of their input training data and then generate new data that has similar characteristics. In the ongoing AI boom, advances in transformer-based deep neural networks have enabled a number of generative AI systems notable for accepting natural language prompts as input. These include large language model (LLM) chatbots such as ChatGPT, Copilot, Gemini, and LLaMA, and text-to-image artificial intelligence art systems such as Stable Diffusion, Midjourney, and DALL-E.

  • Large Language Model (LLM): a language model notable for its ability to achieve general-purpose language generation. LLMs acquire these abilities by learning statistical relationships from text documents during a computationally intensive self-supervised and semi-supervised training process. LLMs are artificial neural networks typically built with a transformer-based architecture. Some recent implementations are based on alternative architectures such as recurrent neural network variants and Mamba (a state space model). LLMs can be used for text generation, a form of generative AI, by taking an input text and repeatedly predicting the next token or word (a toy illustration of this loop appears after these definitions).

  • Retrieval-Augmented Generation (RAG): a technique for enhancing the accuracy and reliability of generative AI models with facts fetched from external sources (a minimal sketch also follows these definitions)
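
To make the "repeatedly predicting the next token or word" part of the LLM definition concrete, here is a toy sketch of the generation loop. The hand-made bigram table stands in for a trained neural network; the vocabulary and probabilities are invented purely for illustration.

```python
import random

# Toy "language model": for each word, a probability distribution over next words.
# A real LLM learns such probabilities with a transformer trained on a huge corpus;
# this hand-made bigram table only illustrates the shape of the generation loop.
NEXT_WORD_PROBS = {
    "knowledge": {"management": 0.7, "sharing": 0.3},
    "management": {"improves": 0.5, "requires": 0.5},
    "improves": {"decisions": 1.0},
    "requires": {"trust": 1.0},
}

def generate(prompt: str, max_new_tokens: int = 5) -> str:
    """Repeatedly predict the next word and append it to the running text."""
    words = prompt.lower().split()
    for _ in range(max_new_tokens):
        choices = NEXT_WORD_PROBS.get(words[-1])
        if not choices:  # no known continuation: stop generating
            break
        next_word = random.choices(list(choices), weights=list(choices.values()))[0]
        words.append(next_word)
    return " ".join(words)

print(generate("knowledge"))  # e.g. "knowledge management improves decisions"
```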
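
Similarly, the RAG definition can be illustrated with a minimal sketch: retrieve the most relevant snippets from a small in-memory knowledge base (here by naive word overlap) and prepend them to the question before it is sent to a generative model. The knowledge base, the scoring rule, and the call_llm placeholder are assumptions for illustration, not a production pipeline.

```python
# Minimal RAG sketch: ground a generative model's answer in retrieved facts.
KNOWLEDGE_BASE = [
    "Expert systems represent knowledge mainly as if-then rules.",
    "Retrieval-augmented generation supplies a model with facts fetched from external sources.",
    "Cognitive computing addresses situations characterized by ambiguity and uncertainty.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Rank documents by simple word overlap with the question (toy retriever)."""
    q_words = set(question.lower().split())
    return sorted(
        KNOWLEDGE_BASE,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )[:k]

def call_llm(prompt: str) -> str:
    """Placeholder: in practice this would call whichever LLM API is in use."""
    return f"[model response grounded in a prompt of {len(prompt)} characters]"

def answer(question: str) -> str:
    context = "\n".join(retrieve(question))
    prompt = (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)

print(answer("How does retrieval-augmented generation improve reliability?"))
```

In a real deployment the toy retriever would be replaced by a vector or keyword search index, and call_llm by an actual generative model, but the retrieve-augment-generate flow stays the same.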

Background

Cognitive computing and artificial intelligence are very hot topics at the time of this writing. In researching this article, I found a large number of recent articles on these and related topics. Many of the writers seem to think that AI is a relatively new technology. It is not. AI has been around for a long time, having gone through several hype cycles.

Awareness of AI is widespread. Movies such as 2001: A Space Odyssey (1968), Colossus: The Forbin Project (1970), and A.I. Artificial Intelligence (2001) have portrayed AI somewhat realistically.

In 1983, Janet Johnson, my colleague at Digital Equipment Corporation, attended extended, intensive training to become an AI specialist. When she returned, we were able to place her at McDonnell Douglas in an expert systems project. But nothing came of it, and before long, she was working on other efforts that did not take advantage of her AI expertise.

The promise of AI has always been tantalizing, but it has struggled to deliver on that promise. The key is to create, promote, and implement cognitive computing killer apps that are markedly more capable than existing alternatives. In a television commercial for Amazon Echo, Alec Baldwin asks Alexa to check the traffic. This is not likely to convince many people to buy an Echo. More compelling use cases for AI are needed.

One of the early areas where AI was applied was expert systems for medicine. AI can help diagnose illnesses, prevent problems with drug interactions, and detect new associations not yet seen by humans. My radiologist friend, Dr. David Osher, told me that computers will soon be able to read most x-rays better than he can. Now that is a compelling use case.

Uses and Benefits

Cognitive computing and artificial intelligence can be used to support decision-making, deliver highly relevant information, and make the best use of people’s limited attention so that key developments are not missed. Implement cognitive computing to help achieve these desirable results:

  • Help people make better decisions, take action more quickly, and achieve more successful outcomes.

  • Deliver relevant information and advice at the time of need.

  • Reduce information overload and optimize people’s available attention span.

  • Allow people to act more efficiently and effectively.

  • Reduce errors, minimize loss and damage, and improve health and safety.

Cognitive computing can simulate human thought processes and mimic the way the human brain works, addressing complex situations that are characterized by ambiguity and uncertainty. Artificial intelligence can perform operations analogous to learning and decision making in humans. Intelligent personal assistants can recognize voice commands and queries, respond with information, or take desired actions quickly, efficiently, and effectively.

Using these approaches can enhance the capabilities of humans by augmenting their powers of observation, analysis, and decision making, and their ability to process and respond to other people and to routine or challenging situations. Cognitive computing tools such as IBM Watson, artificial intelligence tools such as expert systems, and intelligent personal assistant tools such as Amazon Echo, Apple Siri, Google Assistant, and Microsoft Cortana can be used to extend the ability of humans to understand, decide, act, learn, and avoid problems.

Cognitive computing and artificial intelligence are the key elements of an Augment Strategy for knowledge management. Here are three examples of such a strategy:

  1. Non-Profit Organization: provide chatbots and voice recognition for contacting and responding to donors.

  2. Manufacturing Company: implement expert systems for designing, engineering, and building new products.

  3. Consulting Firm: automatically determine the specialties, roles, and interests of consultants and automatically deliver important information relevant to their work at the time of need (a simple matching sketch follows).
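
The consulting-firm example can be made concrete with a very small sketch that matches incoming content against consultants' declared specialties and interests by keyword overlap. The profiles, the sample document, and the scoring rule are invented for illustration; a real system would mine profiles from CVs, project records, and activity data.

```python
# Toy sketch of the consulting-firm example: route incoming items to
# consultants whose declared specialties and interests they mention.
PROFILES = {
    "Alice": {"pharmaceuticals", "regulation", "supply chain"},
    "Bob": {"machine learning", "chatbots", "knowledge management"},
}

def route(document: str, min_hits: int = 1) -> list[str]:
    """Return consultants whose profile terms appear in the document text."""
    text = document.lower()
    matches = []
    for name, terms in PROFILES.items():
        hits = sum(1 for term in terms if term in text)
        if hits >= min_hits:
            matches.append(name)
    return matches

print(route("New chatbots are changing knowledge management practices"))  # ['Bob']
```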

Insights

1. Primer: Make sense of cognitive computing by Bob Violino – quotes Paul Roma of Deloitte

There are three main ways cognitive computing can be applied today:

  • Robotic and cognitive automation to automate repeatable tasks to improve efficiency, quality, and accuracy.

  • Cognitive insights to uncover hidden patterns and relationships to identify new opportunities for innovation.

  • Cognitive engagement to drive customer actions by delivering hyperpersonalization at scale.

How is cognitive computing different from AI?

Deloitte refers to cognitive computing as “more encompassing than the traditional, narrow view of AI.” AI has been primarily used to describe technologies capable of performing tasks normally requiring human intelligence, Roma says. “We see cognitive computing as being defined by machine intelligence, which is a collection of algorithmic capabilities that can augment employee performance, automate increasingly complex workloads, and develop cognitive agents that simulate both human thinking and engagement.”

2. The First Wave of Corporate AI Is Doomed to Fail by Kartik Hosanagar and Apoorv Saxena

Early efforts of companies developing chatbots for Facebook’s Messenger platform saw 70% failure rates in handling user requests. Yet a reversal on these initiatives among large companies would be a mistake. The potential of AI to transform industries truly is enormous. Recent research from McKinsey Global Institute found that 45% of work activities could potentially be automated by today’s technologies, and 80% of that is enabled by machine learning. The report also highlighted that companies across many sectors, such as manufacturing and health care, have captured less than 30% of the potential from their data and analytics investments. Early failures are often used to slow or completely end these investments.

For quick wins, one might focus on changing internal employee touchpoints, using recent advances in speech, vision, and language understanding. Examples of these projects might be a voice interface to help pharmacists look up substitute drugs, or a tool to schedule internal meetings. These are areas in which recently available, off-the-shelf AI tools, such as Google’s Cloud Speech API and Nuance’s speech recognition API, can be used, and they don’t require massive investment in training and hiring. (Disclosure: One of us is an executive at Alphabet Inc., the parent company of Google.) They will not be transformational, but they will help build consensus on the potential of AI. Such projects also help organizations gain experience with large-scale data gathering, processing, and labeling, skills that companies must have before embarking on more-ambitious AI projects.
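
As a rough illustration of the pharmacist quick-win described above, the sketch below transcribes a short spoken query with the Google Cloud Speech-to-Text client library and checks the transcript against a substitute-drug table. The table entries, the audio file name, and the bare-bones configuration are assumptions for illustration; the client calls follow the library's commonly documented pattern and would need valid Google Cloud credentials to run.

```python
from google.cloud import speech  # pip install google-cloud-speech

# Placeholder lookup table; a real deployment would query a pharmacy database.
SUBSTITUTES = {
    "drug a": ["generic a1", "generic a2"],
    "drug b": ["generic b1"],
}

def transcribe(path: str) -> str:
    """Send a short audio clip to Cloud Speech-to-Text and return the transcript."""
    client = speech.SpeechClient()
    with open(path, "rb") as f:
        audio = speech.RecognitionAudio(content=f.read())
    config = speech.RecognitionConfig(language_code="en-US")
    response = client.recognize(config=config, audio=audio)
    return " ".join(result.alternatives[0].transcript for result in response.results)

def lookup_substitutes(query: str) -> list[str]:
    """Naively match the transcribed query against the lookup table."""
    text = query.lower()
    for drug, substitutes in SUBSTITUTES.items():
        if drug in text:
            return substitutes
    return []

spoken = transcribe("query.wav")  # hypothetical clip, e.g. "substitutes for drug a"
print(lookup_substitutes(spoken))
```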

For long-term projects, one might go beyond point optimization, to rethinking end-to-end processes, which is the area in which companies are likely to see the greatest impact. For example, an insurer could take a business process such as claims processing and automate it entirely, using speech and vision understanding. Allstate car insurance already allows users to take photos of auto damage and settle their claims on a mobile app. Technology that’s been trained on photos from past claims can accurately estimate the extent of the damage and automate the whole process. As companies such as Google have learned, building such high-value workflow automation requires not just off-the-shelf technology but also organizational skills in training machine learning algorithms.

3. How Companies Are Already Using AI by Satya Ramaswamy

  • Put AI to work on activities that have an immediate impact on revenue and cost.

  • Look for opportunities in which AI could help you produce more products with the same number of people you have today.

  • Start in the back office, not the front office.

4. How People Will Use AI to Do Their Jobs Better by H. James Wilson and Cyrille Bataller

  • Digital assistants you can trust

  • Software that makes experts better at what they do

  • Enabling innovation

5. The War on Experts by Gary Klein

Artificial Intelligence and Big Data have each claimed to be able to replace experts. However, each of these claims is unwarranted. Let’s start with AI. Smart systems should be able to do things like weather forecasting better (and more cheaply) than humans, but the statistics show that human forecasters improve the machine predictions by about 25%, an effect that has stayed constant over time. AI successes have been in games like chess, Go, and Jeopardy — games that are well-structured, with unambiguous referents and definitive correct solutions. But decision makers face wicked problems with unclear goals in ambiguous and dynamic situations, conditions that are beyond AI systems.

As Ben Shneiderman and I observed in a previous essay, humans are capable of frontier thinking, social engagement, and responsibility for actions. Big Data approaches can search through far more records and sensor inputs than any human, but these algorithms are susceptible to seeing patterns where none really exist. Google’s Flu Trends project was publicized as a success story, but subsequently failed so badly that it was removed from use. Big Data algorithms follow historical trends, but may miss departures from these trends. Further, experts can use their expectancies to spot missing events that may be very important, but Big Data approaches are unaware of the absence of data and events.

6. Steer Clear of the Hype: 5 AI Myths by Christy Pettey

  1. Buy an AI to solve your problems

  2. Everyone needs an AI strategy or a chief AI officer

  3. Artificial intelligence is real

  4. AI technologies define their own goals

  5. AI has human characteristics

7. Micro Explanations For Nine Essential AI Technologies by Mike Gualtieri

  1. Knowledge engineering

  2. Robotics

  3. Speech recognition

  4. Natural language processing

  5. Natural language generation (NLG)

  6. Image analysis

  7. Machine learning

  8. Deep learning

  9. Sensory perception

8. 12 Ways AI Will Disrupt Your C-Suite by Lisa Morgan

  1. Organizational Intelligence Explodes

  2. First-Mover Advantage

  3. Employees May Lead The Charge

  4. Organizational Structures Will Shift

  5. AI Requires Context

  6. Organizations Have To Adapt

  7. It's Not All About Technology

  8. Employee Empowerment Is Necessary

  9. Learn By Doing

  10. Expect The Unexpected

  11. Pay Attention To Possibilities

  12. Change Is At Hand

Resources

1. LinkedIn Topics: Cognitive Computing - Artificial Intelligence

2. SlideShare: Cognitive Computing - Artificial Intelligence

3. Association for the Advancement of Artificial Intelligence

4. Cognitive Computing Consortium (archives)

5. The MIND Project

6. (the late) Sue Feldman

7. Hadley Reynolds

8. Tom Davenport

9. Seth Earley

10. Adi Gaskell

11. IBM

12. APQC

13. KMWorld

14. Rolling Stone: Inside the Artificial Intelligence Revolution: A Special Report

15. RealKM Magazine: AI

16. Chatbots Magazine

17. Information Week

18. CNET: AI

19. Psychology Today: AI

20. InfoWorld: AI

21. SAS: AI

22. Forbes: AI

23. Harvard Business Review: AI

24. MIT: AI

25. SIKM Leaders Community

26. Gartner: AI

27. Forrester: AI

28. Deloitte

29. David Schatsky

30. Gary Klein

31. Tony Rhem - The Connection between Artificial Intelligence & Knowledge Management

32. Mary Abraham

33. Matt Moore

34. Dave Snowden

35. Luis Suarez

36. Laurence Lock Lee

37. Futurism: AI

38. What deep learning really means by Martin Heller

39. The incomplete A-Z of cognitive computing by BBC

40. The Growing Importance of Natural Language Processing by Stephen F. DeAngelis

41. What’s the Difference Between Artificial Intelligence, Machine Learning, and Deep Learning? by Michael Copeland

42. Our Machines Now Have Knowledge We’ll Never Understand by David Weinberger

43. The Definitive Guide to Natural Language Processing

44. A Basic Introduction To Neural Networks

45. 5 ways chatbots are revolutionizing knowledge management by Matt Wade

46. AI Use Cases That Will Advance the Industries in 2021 by Ivana Kotorchevikj

Software

1. Artificial Intelligence Software by Capterra

2. Top Cognitive Computing Companies by Venture Radar

3. Top Cognitive Computing Companies by Predictive Analytics Today

4. G2

5. Liam Hänel

Generative AI Tools

  1. Bing

  2. ChatGPT

  3. ChatPDF

  4. Claude

  5. Gemini

  6. Pi

Training

1. Tutorials Point

2. Udacity

3. Deloitte

4. edX

5. Coursera

6. Machine Learning Courses

7. Ruben Hassid’s List

8. Steve Nouri's List

Books

  1. AI and Machine Learning

  2. Generative AI

  3. Large Language Models

  4. ChatGPT

  5. Neural Networks and Deep Learning by Michael Nielsen

  6. Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville

  7. Cognitive Computing and Big Data Analytics by Judith Hurwitz and Marcia Kaufman

  8. Cognitive Computing: A Brief Guide for Game Changers by Peter Fingar

  9. Smart Machines: IBM's Watson and the Era of Cognitive Computing by John Kelly III and Steve Hamm

  10. Big-Data Analytics for Cloud, IoT and Cognitive Computing by Kai Hwang and Min Chen

  11. Cognitive Computing: Theory and Applications by Venu Govindaraju and Venkat N. Gudivada

  12. Books by Mark Watson

  13. Cognitive (Internet of) Things: Collaboration to Optimize Action by Arvind Sathi

  14. The Sentient Machine: The Coming Age of Artificial Intelligence by Amir Husain

  15. Applications of Cognitive Computing Systems and IBM Watson, edited by Aaditya Telang

  16. Machine Learning for Decision Makers: Cognitive Computing Fundamentals for Better Decision Making by Patanjali Kashyap

  17. The 2021–2026 World Outlook for Cognitive Computing by Philip M. Parker

  18. Advances in Cognitive Informatics and Cognitive Computing by Yingxu Wang and Du Zhang

  19. The AI-Powered Enterprise: Harness the Power of Ontologies to Make Your Business Smarter, Faster, and More Profitable by Seth Earley

  20. The AI Advantage: How to Put the Artificial Intelligence Revolution to Work by Tom Davenport

Video

