AI/machine learning

Artificial Intelligence Requires Left-Brain Thinking To Boost Reliability

How can AI systems incorporate processes mimicking the slower logic- and causality-based reasoning patterns of the left brain?


The first night of the 2017 SPE Annual Technical Conference and Exhibition (ATCE) featured a presentation on the steps needed for artificial intelligence (AI) systems to become more reliable in oil and gas operations. Speaking at the Petroleum Data-Driven Analytics Technical Section dinner, Cycorp founder and chief executive officer Douglas Lenat discussed the approaches AI has taken, and should take, to incorporate automated slow-thought reasoning processes and thus overcome an inherent brittleness problem in information retrieval.

Lenat said that the human brain has a number of fallback options when faced with situations that require the categorization of data points: It can rely on a continually expanding set of general knowledge or common sense; it can analogize to superficially far-flung situations that may bear some similarities; and it can infer solutions to problems based on reasoning.

Expert information retrieval software systems can behave in a similar manner; streaming media services such as Netflix and Hulu deduce preferences based on pattern recognition. However, Lenat said, these expert systems often have no model of what they do and do not know, and they have little understanding of the context in which rules or facts are stated. He referred to this problem as the “brittleness bottleneck”: software programs that traditionally represent documents as groups of words are restricted to the individual word occurrences found in a limited training set.
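To make the bottleneck concrete, the following minimal Python sketch (using hypothetical toy documents and categories, not examples from the presentation) shows how a bag-of-words learner only credits word occurrences it saw during training; anything outside that vocabulary contributes nothing, yet the system still confidently returns an answer.

```python
# A minimal sketch of the "brittleness bottleneck": a bag-of-words model only knows
# the word occurrences seen in its limited training set, so any term outside that
# vocabulary simply vanishes. The documents and labels are hypothetical toy data.
from collections import Counter

training_docs = {
    "equipment": ["pump pressure valve corrosion", "valve leak pressure drop"],
    "personnel": ["crew shift schedule training", "training safety crew briefing"],
}

# Build per-category word counts from the training set -- the only "knowledge" the model has.
vocab_counts = {
    label: Counter(word for doc in docs for word in doc.split())
    for label, docs in training_docs.items()
}

def classify(text: str) -> str:
    """Score a document by overlap with each category's training vocabulary."""
    words = text.split()
    scores = {
        label: sum(counts[w] for w in words)  # unseen words contribute nothing
        for label, counts in vocab_counts.items()
    }
    return max(scores, key=scores.get)

# "rust on the wellhead flange" shares no words with either category, yet the model
# still returns an answer -- it has no sense of being outside its competence.
print(classify("rust on the wellhead flange"))
```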

Lenat gave several examples of the brittleness bottleneck during his presentation, including one in which a skin-disease diagnosis expert system he had been working on diagnosed a car as having measles after being told about its rusty condition. He said that this problem still pervades software today.

“What’s going on here is similar to a situation where the expert system is off the cliff. It’s out of its area of competence, but, because it can’t look down, because it doesn’t realize it’s out of its area of competence, it blindly goes ahead and answers a question,” Lenat said. “It’s funny, but, if it’s your life or livelihood on the line, then it’s obviously not so funny.” 


"AI systems must also learn to incorporate the types of reasoning typically used by the “left brain,” which focuses on a slow analysis of logic and causality."

Lenat said that the neural-net-based machine learning systems used to process big data often work like a human’s “right brain,” relying on fast intuition and pattern recognition. To address the brittleness issue, he said, these systems must also learn to incorporate the types of reasoning typically used by the “left brain,” which focuses on a slow analysis of logic and causality. The ability to explain the reasoning behind these decisions could lead to significant advances in assisting oil and gas operations.
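A minimal sketch of what that pairing might look like, echoing the car-with-measles example: a stand-in pattern matcher produces a superficially plausible label, and a separate rule layer checks the conclusion against background knowledge before accepting it. The classifier, knowledge base, and rule below are hypothetical illustrations, not Cycorp’s actual system.

```python
# A minimal sketch of pairing "right-brain" pattern recognition with a "left-brain"
# sanity check. All data and rules here are hypothetical illustrations.

# Background knowledge that a purely statistical model lacks.
KNOWLEDGE = {
    "car": {"is_organism": False},
    "patient": {"is_organism": True},
}

def right_brain_diagnose(observations: list[str]) -> str:
    """Stand-in for a pattern matcher: rusty spots superficially resemble measles."""
    return "measles" if "rust spots" in observations else "unknown"

def left_brain_check(subject: str, diagnosis: str) -> str:
    """Reject conclusions that violate causal/common-sense constraints."""
    if diagnosis == "measles" and not KNOWLEDGE.get(subject, {}).get("is_organism", False):
        return "out of competence: only organisms can contract diseases"
    return diagnosis

print(left_brain_check("car", right_brain_diagnose(["rust spots"])))      # flagged as out of competence
print(left_brain_check("patient", right_brain_diagnose(["rust spots"])))  # measles
```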

Incorporating left-brain thinking requires successfully analyzing various logical and arithmetic combinations across information sources, which Lenat said takes tens of thousands of data points. A great deal of information must be shared, such as the special characters in an alphabet; morphological events; syntactic and semantic metalevel markups; and context, which can span several dimensions of metadata such as time and space.

“Imagine that you have all the correct taxonomic placements for all these terms, like ‘bed,’ ‘night,’ ‘person,’ sleep,’ ‘house,’ and so on. Just because you know taxonomically where you are doesn’t mean that you have the rules of thumb everybody expects to have about those concepts, all the things that we know that would enable us to say, ‘well, if something happens at 3 in the morning, let’s call this person at home rather than at work because they’re probably sleeping,’ ” he said.