Cognitive Computing and Artificial Intelligence
50th in a series of 50 Knowledge Management Components (Slide 64 in KM 102)
Definitions
Cognitive computing: simulation of human thought processes in a computerized model, involving self-learning systems that use data mining, pattern recognition and natural language processing to mimic the way the human brain works; makes a new class of problems computable, addressing complex situations that are characterized by ambiguity and uncertainty
Artificial intelligence: the capacity of a computer to perform operations analogous to learning and decision making in humans, as by an expert system
Expert systems: computerized systems that emulate the decision-making ability of a human expert; designed to solve complex problems by reasoning about knowledge, represented mainly as if–then rules rather than through conventional procedural code
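To make the if–then rule idea concrete, here is a minimal sketch of forward-chaining inference in Python; the medical facts and rules are invented for illustration, not drawn from any real expert system:

```python
# Minimal sketch of an expert system's rule-based inference:
# forward chaining over if-then rules until no new facts can be derived.

rules = [
    ({"fever", "cough"}, "possible_flu"),            # IF fever AND cough THEN possible_flu
    ({"possible_flu", "fatigue"}, "recommend_rest"), # IF possible_flu AND fatigue THEN recommend_rest
]

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:                       # keep firing rules until a fixed point
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)    # fire the rule, adding its conclusion
                changed = True
    return facts

print(forward_chain({"fever", "cough", "fatigue"}, rules))
```

Real expert systems add features this sketch omits, such as certainty factors and explanation facilities, but the core loop of matching rule conditions against a fact base is the same.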
Natural language processing: a branch of artificial intelligence that deals with analyzing, understanding and generating the languages that humans use naturally in order to interface with computers in both written and spoken contexts using natural human languages instead of computer languages
Machine learning: giving computers the ability to learn without being explicitly programmed; a method of data analysis that automates analytical model building; using algorithms that iteratively learn from data to find hidden insights without being explicitly programmed where to look
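The phrase "learning without being explicitly programmed" can be illustrated with a toy example: instead of hand-coding the rule y = 2x, a short gradient-descent loop discovers it from example data. The data and learning rate here are invented for illustration:

```python
# A minimal sketch of learning from data: gradient descent fits y = w*x
# to examples instead of a programmer writing the rule by hand.

data = [(1, 2), (2, 4), (3, 6)]    # hidden rule: y = 2x

w = 0.0                            # initial guess for the weight
lr = 0.05                          # learning rate
for _ in range(200):               # iteratively reduce prediction error
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 2))                 # converges near 2.0
```

This is the essence of the definition above: the algorithm iterates over the data and adjusts its model to fit, without being told where the pattern lies.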
Deep learning: part of a broader family of machine learning methods based on learning data representations, as opposed to task-specific algorithms
Neural networks: computing systems made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs; artificial neural networks (ANNs) are algorithms or actual hardware that are loosely modeled after the neuronal structure of the mammalian cerebral cortex but on much smaller scales
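A single artificial neuron, the "simple, highly interconnected processing element" of the definition above, can be sketched in a few lines: it computes a weighted sum of its inputs and passes the result through a nonlinear activation function. The weights and inputs here are arbitrary illustrative values:

```python
import math

# One artificial neuron: weighted sum of inputs plus a bias,
# passed through a sigmoid activation that squashes the result to (0, 1).

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(z)

out = neuron([1.0, 0.5], [0.4, -0.2], 0.1)
print(out)   # a value between 0 and 1
```

A full network wires many such neurons into layers; training adjusts the weights and biases so the network's outputs match known examples.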
Intelligent agents: algorithms that interpret requests and provide responses for specific domains and tasks, and produce unique results to questions that have not been pre-programmed
Chatbot: a computer program that conducts a conversation via auditory or textual methods; often designed to convincingly simulate how a human would behave as a conversational partner, sometimes with the goal of passing the Turing test
Generative AI (GenAI): artificial intelligence capable of generating text, images, synthetic data, or other media, using generative models. Generative AI models learn the patterns and structure of their input training data and then generate new data that has similar characteristics. In the ongoing AI boom, advances in transformer-based deep neural networks have enabled a number of generative AI systems notable for accepting natural language prompts as input. These include large language model (LLM) chatbots such as ChatGPT, Copilot, Gemini, and LLaMA, and text-to-image artificial intelligence art systems such as Stable Diffusion, Midjourney, and DALL-E.
Large Language Model (LLM): a language model notable for its ability to achieve general-purpose language generation. LLMs acquire these abilities by learning statistical relationships from text documents during a computationally intensive self-supervised and semi-supervised training process. LLMs are artificial neural networks typically built with a transformer-based architecture. Some recent implementations are based on alternative architectures such as recurrent neural network variants and Mamba (a state space model). LLMs can be used for text generation, a form of generative AI, by taking an input text and repeatedly predicting the next token or word.
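The "repeatedly predicting the next token" idea can be illustrated with a toy bigram model. Real LLMs learn statistical relationships with transformer networks trained on vast corpora, not word-pair counts over a single sentence, but the generation loop has the same shape:

```python
from collections import Counter, defaultdict

# Toy illustration of next-token prediction: count which word follows
# which in a tiny corpus, then generate by repeatedly taking the most
# likely next word.

corpus = "the cat sat on the mat the cat ran".split()

following = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    following[cur][nxt] += 1          # learn next-word statistics

def generate(start, n):
    tokens = [start]
    for _ in range(n):
        candidates = following[tokens[-1]].most_common(1)
        if not candidates:            # no known continuation
            break
        tokens.append(candidates[0][0])
    return " ".join(tokens)

print(generate("the", 4))
```

Swapping the bigram table for a neural network that conditions on the entire preceding text is, at this level of abstraction, the step from this toy to an LLM.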
Retrieval-Augmented Generation (RAG): a technique for enhancing the accuracy and reliability of generative AI models with facts fetched from external sources
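The RAG pattern can be sketched as two steps: retrieve relevant text, then include it in the prompt sent to a generative model. This toy version ranks documents by word overlap rather than the vector-similarity search real systems use, and it stops short of the model call itself, which would go to whatever LLM API the system is built on:

```python
import re

# Minimal sketch of Retrieval-Augmented Generation: fetch the most
# relevant document for a query, then build a grounded prompt from it.
# The documents are invented examples.

documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support is available by phone Monday through Friday.",
]

def words(text):
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(query, docs):
    # naive relevance score: number of words shared with the query
    return max(docs, key=lambda d: len(words(query) & words(d)))

def build_prompt(query, docs):
    context = retrieve(query, docs)
    return f"Use only this context to answer.\nContext: {context}\nQuestion: {query}"

print(build_prompt("What is the refund policy?", documents))
```

The prompt, not the model, carries the external facts; this is what lets RAG improve accuracy without retraining the underlying model.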
Background
Cognitive computing and artificial intelligence are very hot topics at the time of this writing. In researching this article, I found a large number of recent articles on these and related topics. Many of the writers seem to think that AI is a relatively new technology. It is not. AI has been around for a long time, having gone through several hype cycles.
Awareness of AI is widespread. Movies such as 2001: A Space Odyssey (1968), Colossus: The Forbin Project (1970), and A.I. Artificial Intelligence (2001) have portrayed AI somewhat realistically.
In 1983, Janet Johnson, my colleague at Digital Equipment Corporation, attended extended, intensive training to become an AI specialist. When she returned, we were able to place her at McDonnell Douglas in an expert systems project. But nothing came of it, and before long, she was working on other efforts that did not take advantage of her AI expertise.
The promise of AI has always been tantalizing, but it has struggled to deliver on that promise. The key is to create, promote, and implement cognitive computing killer apps that are markedly more capable than existing alternatives. In a television commercial for Amazon Echo, Alec Baldwin asks Alexa to check the traffic. This is not likely to convince many people to buy an Echo. More compelling use cases for AI are needed.
One of the early areas where AI was applied was expert systems for medicine. AI can help diagnose illnesses, prevent problems with drug interactions, and detect new associations not yet seen by humans. My radiologist friend, Dr. David Osher, told me that computers will soon be able to read most x-rays better than he can. Now that is a compelling use case.
Uses and Benefits
Cognitive computing and artificial intelligence can be used to support decision-making, deliver highly relevant information, and optimize the available attention to avoid missing key developments. Implement cognitive computing to help achieve these desirable results:
Help people make better decisions, take action more quickly, and achieve more successful outcomes.
Deliver relevant information and advice at the time of need.
Reduce information overload and optimize people’s available attention span.
Allow people to act more efficiently and effectively.
Reduce errors, minimize loss and damage, and improve health and safety.
Cognitive computing can simulate human thought processes and mimic the way the human brain works, addressing complex situations that are characterized by ambiguity and uncertainty. Artificial intelligence can perform operations analogous to learning and decision making in humans. Intelligent personal assistants can recognize voice commands and queries, respond with information, or take desired actions quickly, efficiently, and effectively.
Using these approaches can enhance the capabilities of humans by augmenting their powers of observation, analysis, decision making, processing, and responding to other people and to routine or challenging situations. Cognitive computing tools such as IBM Watson, artificial intelligence tools such as expert systems, and intelligent personal assistant tools such as Amazon Echo, Apple Siri, Google Assistant, and Microsoft Cortana can be used to extend the ability of humans to understand, decide, act, learn, and avoid problems.
Cognitive computing and artificial intelligence are the key elements of an Augment Strategy for knowledge management. Here are three examples of such a strategy:
Non-Profit Organization: provide chatbots and voice recognition for contacting and responding to donors.
Manufacturing Company: implement expert systems for designing, engineering, and building new products.
Consulting Firm: automatically determine the specialties, roles, and interests of consultants and automatically deliver important information relevant to their work at the time of need.
Insights
1. Primer: Make sense of cognitive computing by Bob Violino – quotes Paul Roma of Deloitte
There are three main ways cognitive computing can be applied today:
Robotic and cognitive automation to automate repeatable tasks to improve efficiency, quality, and accuracy.
Cognitive insights to uncover hidden patterns and relationships to identify new opportunities for innovation.
Cognitive engagement to drive customer actions by delivering hyperpersonalization at scale.
How is cognitive computing different from AI?
Deloitte refers to cognitive computing as “more encompassing than the traditional, narrow view of AI.” AI has been primarily used to describe technologies capable of performing tasks normally requiring human intelligence, he says. “We see cognitive computing as being defined by machine intelligence, which is a collection of algorithmic capabilities that can augment employee performance, automate increasingly complex workloads, and develop cognitive agents that simulate both human thinking and engagement.”
2. The First Wave of Corporate AI Is Doomed to Fail by Kartik Hosanagar and Apoorv Saxena
Early efforts of companies developing chatbots for Facebook’s Messenger platform saw 70% failure rates in handling user requests. Yet a reversal on these initiatives among large companies would be a mistake. The potential of AI to transform industries truly is enormous. Recent research from McKinsey Global Institute found that 45% of work activities could potentially be automated by today’s technologies, and 80% of that is enabled by machine learning. The report also highlighted that companies across many sectors, such as manufacturing and health care, have captured less than 30% of the potential from their data and analytics investments. Early failures are often used to slow or completely end these investments.
For quick wins, one might focus on changing internal employee touchpoints, using recent advances in speech, vision, and language understanding. Examples of these projects might be a voice interface to help pharmacists look up substitute drugs, or a tool to schedule internal meetings. These are areas in which recently available, off-the-shelf AI tools, such as Google’s Cloud Speech API and Nuance’s speech recognition API, can be used, and they don’t require massive investment in training and hiring. (Disclosure: One of us is an executive at Alphabet Inc., the parent company of Google.) They will not be transformational, but they will help build consensus on the potential of AI. Such projects also help organizations gain experience with large-scale data gathering, processing, and labeling, skills that companies must have before embarking on more-ambitious AI projects.
For long-term projects, one might go beyond point optimization, to rethinking end-to-end processes, which is the area in which companies are likely to see the greatest impact. For example, an insurer could take a business process such as claims processing and automate it entirely, using speech and vision understanding. Allstate car insurance already allows users to take photos of auto damage and settle their claims on a mobile app. Technology that’s been trained on photos from past claims can accurately estimate the extent of the damage and automate the whole process. As companies such as Google have learned, building such high-value workflow automation requires not just off-the-shelf technology but also organizational skills in training machine learning algorithms.
3. How Companies Are Already Using AI by Satya Ramaswamy
Put AI to work on activities that have an immediate impact on revenue and cost.
Look for opportunities in which AI could help you produce more products with the same number of people you have today.
Start in the back office, not the front office.
4. How People Will Use AI to Do Their Jobs Better by H. James Wilson and Cyrille Bataller
Digital assistants you can trust
Software that makes experts better at what they do
Enabling innovation
5. The War on Experts by Gary Klein
Artificial Intelligence and Big Data have each claimed to be able to replace experts. However, each of these claims is unwarranted. Let’s start with AI. Smart systems should be able to do things like weather forecasting better (and more cheaply) than humans, but the statistics show that human forecasters improve the machine predictions by about 25%, an effect that has stayed constant over time. AI successes have been in games like chess, Go, and Jeopardy — games that are well-structured, with unambiguous referents and definitive correct solutions. But decision makers face wicked problems with unclear goals in ambiguous and dynamic situations, conditions that are beyond AI systems.
As Ben Shneiderman and I observed in a previous essay, humans are capable of frontier thinking, social engagement, and responsibility for actions. Big Data approaches can search through far more records and sensor inputs than any human, but these algorithms are susceptible to seeing patterns where none really exist. Google’s FluTrends project was publicized as a success story, but subsequently failed so badly that it was removed from use. Big Data algorithms follow historical trends, but may miss departures from these trends. Further, experts can use their expectancies to spot missing events that may be very important, but Big Data approaches are unaware of the absence of data and events.
6. Steer Clear of the Hype: 5 AI Myths by Christy Pettey
Buy an AI to solve your problems
Everyone needs an AI strategy or a chief AI officer
Artificial intelligence is real
AI technologies define their own goals
AI has human characteristics
7. Micro Explanations For Nine Essential AI Technologies by Mike Gualtieri
Knowledge engineering
Robotics
Speech recognition
Natural language processing
Natural language generation (NLG)
Image analysis
Machine learning
Deep learning
Sensory perception
8. 12 Ways AI Will Disrupt Your C-Suite by Lisa Morgan
Organizational Intelligence Explodes
First-Mover Advantage
Employees May Lead The Charge
Organizational Structures Will Shift
AI Requires Context
Organizations Have To Adapt
It's Not All About Technology
Employee Empowerment Is Necessary
Learn By Doing
Expect The Unexpected
Pay Attention To Possibilities
Change Is At Hand
Resources
1. LinkedIn Topics: Cognitive Computing - Artificial Intelligence
2. SlideShare: Cognitive Computing - Artificial Intelligence
3. Association for the Advancement of Artificial Intelligence
4. Cognitive Computing Consortium (archives)
What is an intelligent agent? by Frederick Mills and Robert Stufflebeam
Artificial Intelligence: Can a Machine Think? by David L. Anderson
6. (the late) Sue Feldman
Synthexis Blog (archives)
Cognitive Computing and Knowledge Management: Sparking Innovation - transcript of a video of Susan Feldman's keynote session at KMWorld 2015, and the full video of the session - Slides - Blog post
Cognitive Computing Consortium Blog (archives)
Cognitive computing: Big data and cognitive computing - Part 1
Cognitive computing: Big data and cognitive computing - Part 2
Cognitive computing: Beyond the hype: cognitive computing and your business, your job, your life
Lessons from the Cognitive Front Lines: Early Adopters of IBM’s Watson
Cognitive technologies all set to transform business processes
Learn to stop worrying and love the smart machines you’ll be working alongside in the future
8. Seth Earley
9. Adi Gaskell
10. IBM
11. APQC
Cognitive Computing and the Evolution of Knowledge Work (Collection)
Needs and Applications for Cognitive Computing: An Interview with Elmer Corbin
Cognitive Computing: Carla O’Dell on the Next Disruptive Force in KM
Deriving Knowledge Insights through Machine Learning at Deloitte
12. KMWorld
13. Rolling Stone: Inside the Artificial Intelligence Revolution: A Special Report
14. The Complete Beginner’s Guide To Chatbots by Matt Schlicht
15. How Bots Will Completely Kill Websites and Mobile Apps by Matt Schlicht
16. Information Week
AI, Cognitive Computing To Disrupt Enterprises: IDC by Jessica Davis
IBM: AI Should Stand For 'Augmented Intelligence' by Thomas Claburn
How Smart Is Artificial Intelligence Really? by Oliver Schabenberger
17. CNET: AI
Alexa, be more human by Ben Fox Rubin
The difference between Google Now and Google Assistant by Sarah Jacobsson Purewal
19. InfoWorld: AI
20. SAS: AI
21. Forbes: AI
What Is The Difference Between Deep Learning, Machine Learning and AI?
How Quantum Computers Will Revolutionize Artificial Intelligence, Machine Learning And Big Data
What Companies Are Winning The Race For Artificial Intelligence?
11 Tech Leaders Share The Real Truth About Artificial Intelligence (And What Really Matters)
These 5 Fields Will Be The First Hotbeds For Artificial Intelligence
22. Harvard Business Review: AI
What Artificial Intelligence Can and Can’t Do Right Now by Andrew Ng
In the AI Age, “Being Smart” Will Mean Something Completely Different by Ed Hess
How Artificial Intelligence Will Redefine Management by Vegard Kolbjørnsrud, Richard Amico, and Robert J. Thomas
8 Ways Machine Learning Is Improving Companies’ Work Processes by Dan Wellers, Timo Elliott, and Markus Noga
4 Models for Using AI to Make Decisions by Michael Schrage
Artificial Intelligence Is Almost Ready for Business by Brad Power
The Rise of AI Makes Emotional Intelligence More Important by Megan Beck and Barry Libert
To Get Consumers to Trust AI, Show Them Its Benefits by Ellen Enkel
23. MIT: AI
Augmentation Versus Automation: AI’s Utility in the Workplace by Allison Ryder
Putting AI to Work (video) Moderator: Joi Ito; Speakers: Ali Azarbayejani, Seth Earley, Ryan Gariepy, Josh Tenenbaum
Explained: Neural networks by Larry Hardesty
25. Gartner: AI
Gartner Says AI Technologies Will Be in Almost Every New Software Product by 2020
Applying Artificial Intelligence to Drive Business Transformation: A Gartner Trend Insight Report
26. Forrester: AI
27. Deloitte
Reconstructing work: Automation, artificial intelligence, and the essential role of humans
The rise of cognitive work (re)design: Applying cognitive tools to knowledge-based work
Time to move: From interest to adoption of cognitive technology
Cognitive computing: How AI-based capabilities are transforming business – infographic
Artificial intelligence and human-computer collaboration – webinar
28. David Schatsky
Why the hysteria about killer artificial intelligence is a good thing
How to create business value with cognitive technologies – video
29. Gary Klein
30. Tony Rhem - The Connection between Artificial Intelligence & Knowledge Management
31. Mary Abraham
32. Matt Moore
33. Dave Snowden
34. Luis Suarez
36. Futurism: AI
37. What deep learning really means by Martin Heller
38. The incomplete A-Z of cognitive computing by BBC
39. The Growing Importance of Natural Language Processing by Stephen F. DeAngelis
40. What’s the Difference Between Artificial Intelligence, Machine Learning, and Deep Learning? by Michael Copeland
41. Our Machines Now Have Knowledge We’ll Never Understand by David Weinberger
42. The Definitive Guide to Natural Language Processing
43. A Basic Introduction To Neural Networks
44. 5 ways chatbots are revolutionizing knowledge management by Matt Wade
45. AI Use Cases That Will Advance the Industries in 2021 by Ivana Kotorchevikj
Software
1. Artificial Intelligence Software by Capterra
2. Top Cognitive Computing Companies by Venture Radar
3. Top Cognitive Computing Companies by Predictive Analytics Today
4. G2
5. Liam Hänel
Part 1 AI tools for Personal use
Part 2 AI tools for Businesses
Generative AI Tools
Training
2. Udacity
3. Deloitte
4. edX
5. Coursera
Books
Neural Networks and Deep Learning by Michael Nielsen
Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
Cognitive Computing and Big Data Analytics by Judith Hurwitz and Marcia Kaufman
Cognitive Computing: A Brief Guide for Game Changers by Peter Fingar
Smart Machines: IBM's Watson and the Era of Cognitive Computing by John Kelly III and Steve Hamm
Big-Data Analytics for Cloud, IoT and Cognitive Computing by Kai Hwang and Min Chen
Cognitive Computing: Theory and Applications by Venu Govindaraju and Venkat N. Gudivada
Cognitive (Internet of) Things: Collaboration to Optimize Action by Arvind Sathi
The Sentient Machine: The Coming Age of Artificial Intelligence by Amir Husain
Applications of Cognitive Computing Systems and IBM Watson Edited by Aaditya Telang
Machine Learning for Decision Makers: Cognitive Computing Fundamentals for Better Decision Making by Patanjali Kashyap
The 2021–2026 World Outlook for Cognitive Computing by Philip M. Parker
Advances in Cognitive Informatics and Cognitive Computing by Yingxu Wang and Du Zhang
The AI-Powered Enterprise: Harness the Power of Ontologies to Make Your Business Smarter, Faster, and More Profitable by Seth Earley
The AI Advantage: How to Put the Artificial Intelligence Revolution to Work by Tom Davenport
Video