AI Applications in the Mining Industry into the 21st Century


John A. Meech

University of British Columbia

Department of Mining and Mineral Process Engineering

Vancouver, British Columbia


Over the past 15 years, Artificial Intelligence techniques have shown promise of providing the ability to capture "human intelligence" in a computer program. The subfields of Expert Systems and Artificial Neural Networks are finding increased use in industry to perform tasks previously considered the domain of human thinking. But are these methods just a new type of software, much like that of the past, or is real intelligence hidden within the lines of code? Will this field continue to enjoy high growth rates and, if so, what developments will we see in the near future? Which algorithms and methodologies will continue to grow?

This paper discusses a number of paradigms which can be used to create successful applications. There are certain barriers, some psychological, some linked with knowledge acquisition, that must be overcome or avoided to ensure system performance. Development time also requires careful consideration before embarking on an AI project. The paper focuses on constructing goal-driven systems that transfer high-technology solutions from the research laboratory to the plant floor.

Respect for system users is of paramount importance, whether their involvement is to access contained information or to participate in program maintenance as new knowledge becomes available. Examples of successful and unsuccessful AI implementation in Canadian industry and academia provide evidence of the importance of this issue. The paper also examines some current constraints on applying AI in the real world. Among these problems are automated knowledge acquisition, methods to acquire data during run-time, real-time systems (hardware versus software), explanation and justification features, and adaptive/learning systems.

presented at


Brisbane, Australia, July 1995.



"An Expert is someone who has already made all of the possible mistakes in a very narrow field of study."

Niels Bohr

Artificial Intelligence has experienced rapid growth over the past decade. Numerous Expert System development tools entered the marketplace in parallel with the explosion of microcomputers, and there are many examples of successful applications. Processing plants worldwide now use AI to solve real-world problems.

AI methodologies have been distilled into rule-based environments with techniques that link incoming data to sub-goals and then to final conclusions. Frame-based approaches have evolved into object-oriented methods that group information into classes, allowing construction of efficient rule structures in terms of both reduced numbers of rules and processing time.

But, as more and more technical people have become familiar with these methods, it is apparent there are limitations to the ease with which applications are developed. It takes considerable skill in the following areas to ensure success: psychology, computer programming, knowledge acquisition and representation, etc. Without training, it is difficult to build useful and productive systems, ones with big payback!

Expert Systems have promised to embody "human intelligence" in a computer program. This they do well, and a system can function with only a single rule in its knowledge base, making development relatively painless. Such systems have limited ability and knowledge, but not all do. There are techniques today to incorporate expertise into a Hypertext Interface so Users control and manipulate the domain as they desire.

Early systems were considered small or toy-like unless they contained hundreds or even thousands of rules. Today, a system containing this number of rules is either an extremely large application with extensive domain-specific knowledge and perhaps some global knowledge capability, or it has been built using old techniques. Newer intelligent methods such as Fuzzy Logic, Artificial Neural Networks and Evolutionary Computing significantly enhance the ability to represent knowledge effectively and efficiently.

This paper describes the features of AI techniques that can be used to create productive systems. We will begin by examining components of human intelligence that are currently well-modeled. The approach to building a successful system is then presented. The drawbacks of existing methods and the attempts to address these issues are discussed, together with the introduction of a new AI field: Computational Intelligence. Several successful systems are used throughout the text to illustrate the methods indicated.


"Knowledge comes, but wisdom lingers."

Lord Tennyson

To understand how we can simulate the human thought process, it is necessary to examine some components of our thinking ability. A partial list might include the following items:


processing symbols;


solving problems using a variety of techniques;


mathematical processing


empirical regression analysis


heuristic modeling


explaining and justifying;


handling uncertainty;


learning and adapting;


practicing introspection and invention;


being creative and designing;


generalizing and summarizing;


using multiple sensors (seeing, hearing, smelling, tasting, touching);


generating multiple output (speech, writing, drawing, hand signals, etc.).


Current Expert Systems handle some of these items but are poor at mathematical problem-solving (too slow); they cannot learn and are not usually adaptable; generalization and summarization skills are essentially non-existent. In real-time, when properly configured, many measurement-reaction situations requiring several seconds turnaround-time can be handled, but faster times are still unavailable. Speed issues will eventually be addressed with more powerful hardware such as parallel-processing computers, but this is several years away.

Multiple output is essential for these systems, but not necessarily in the way humans communicate. A User Interface should provide graphical data presentation that is easy to comprehend. Access to background data should be available. Most AI tools today can communicate with other software, either directly through RAM, stored data files or across a series of networked computers (the first use of parallel-processing devices).

Data entry by keyboard and mouse is the best way to communicate with an Expert System. Audio alarms are useful to draw attention to key process upsets, but voice-activated messages and voice-recognition capability are still not advanced enough. In a few years, however, this situation is likely to change. Already there is software on the market that provides dictation ability and voice entry. Some plants have tried on-line audio/visual help menus, but these quickly become repetitious and are only beneficial the first few times they are used. Development time is too long to justify the small rewards. Such multi-media features are too static and require too much storage space to be useful in the long run.

The key area of success with this technology is the ability to rapidly model a process with almost the exact terminology used by an Expert. The features of knowledge representation in a Development Tool can be learned quickly, but the time to acquire knowledge is not short, as the system must be built one step at a time, each jewel of wisdom bit by bit.


"The clarity of knowledge is inversely proportional to the number of Experts."

Philippe Poirier

Expert Systems possess the ability to embody "intelligence" in a computer program in much the way human reasoning takes place. Facts are stored symbolically, with their truth or falsity dependent upon current circumstances while using a system. A Knowledge Engineer can build a system using the exact words of the Expert, expressed as symbols and rules-of-thumb that link these symbols. For example, an Operator might express the following idea:

"When froth conditions on the lead rougher bank are porridge-popping, I reduce the collector addition rate by at least 100 cc/min unless it is already at minimum. In that case, I increase the water addition to the rougher feed provided the Pb feed grade is not high. If it is high, then I reduce the feed tonnage."

This simple paragraph can be formulated into the following code:

IF    froth.condition.porridge_popping is TRUE
AND   collector.addition_rate.minimum is FALSE
THEN  collector.addition_rate_change.Negative_Big is TRUE
ELSE  MACRO ("Water_addition_increase")

MACRO ("Water_addition_increase")
IF    feed.Pb_grade.high is FALSE
THEN  feed.water_addition.Increase is TRUE
ELSE  feed.tonnage_rate.Decrease is TRUE

Fuzzy Sets will back up these rules to map discrete measurements into the linguistic terms in the rule statements. Explanations can be attached to each fact, and rule descriptions to each rule, as desired to provide Users with the ability to probe the knowledge base for further details should these rules fire successfully.
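As a sketch of how such a mapping might work (the froth measurement, the term name and the breakpoints below are hypothetical illustrations, not taken from the paper's system), a trapezoidal membership function converts a raw reading into a degree of belief in a linguistic term:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 below a, rising a..b, 1 between b..c, falling c..d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical example: belief that a froth bubble-size reading (mm)
# matches the linguistic term "porridge_popping".
belief = trapezoid(4.2, a=2.0, b=3.5, c=6.0, d=8.0)
print(belief)  # 1.0 (the reading falls in the flat region of the set)
```

A rule such as the one above would then fire on the resulting belief rather than on the raw number.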


These rules and others are added to the system incrementally, with static and dynamic testing of the knowledge base at various points in time as the system grows. Other "experts" might be polled to ensure there is general agreement about the strategy to be implemented in the system. If conflict arises, a method to arbitrate disagreement must be set up, usually with the support of suitable metallurgical staff.

During a consultation with an Expert System, a User can obtain justification for the advice given or ask the system about the meaning of certain dialog: "Why are you asking me for this information?" or "What does that term mean?" or "How did you arrive at your conclusion?" In this way, such systems serve an important training function in addition to fulfilling their original goal: diagnosis, monitoring or control.
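A minimal sketch of such an explanation facility (the rule, its justification text and the glossary entries are invented for illustration) attaches a "why" string to each rule and a definition to each term:

```python
# Invented rule-base fragment: each rule carries its own justification so a
# User can ask "Why?" after it fires; a glossary answers "What does X mean?".
rules = {
    "reduce_collector": {
        "conclusion": "collector.addition_rate_change.Negative_Big",
        "why": "Porridge-popping froth on the rougher bank suggests collector overdose.",
    },
}
glossary = {
    "porridge-popping": "Froth with large, sluggish, bursting bubbles.",
}

def why(rule_id):
    """Justify a fired rule to the User."""
    return rules[rule_id]["why"]

def define(term):
    """Explain a linguistic term used in the dialog."""
    return glossary.get(term, "No definition available.")
```

The training value the paper describes comes from exactly this kind of lookup being available at any point in a consultation.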

While this provides a dialog which mimics how a real "Expert" might communicate expertise to a novice or lesser expert, there is more to human intelligence than providing advice and explanations. For example, these systems cannot learn from experience. Neither can they devise new knowledge from an examination of their current knowledge. They simply perform what a human has created them to do. So, based on today's techniques, Expert Systems appear to have reached a watershed.


"Belief is like a guillotine, just as heavy and just as light."

Franz Kafka

Domain Selection

The choice of an appropriate domain to begin applying AI is not always clear. Often, a careful organization selects an application that is easy to implement in order to demonstrate to skeptical individuals that the technology does work. A consultative system is generally the preferred starting point, with development of an off-line advising system. Once the technology is proven, other applications should grow naturally out of this initial success. While this conservative approach is likely to overcome resistance to AI, it is a slow route and does not produce large rewards quickly.

An alternate approach, considered more risky and certainly more costly, is to begin with an on-line real-time monitoring or control system. These are fast pay-back applications which will pay for their initial cost in several weeks or months. The initial prototype is developed fast. Select a well-understood problem area, one that is difficult to automate but has existing sensor data. Build a system with a useful and user-friendly interface so operating personnel adopt the system as an essential tool in their work.

On-line systems can be prototyped in as little as 2 weeks by experienced personnel. With close cooperation of Management and Users, these systems produce major benefits in as little as one month. The process begins by rapid development of the knowledge base and User Interface. Once tested in a static mode, the system is hooked into the plant DCS database or PLC system to operate in a monitoring mode in parallel with existing manual control. System advice or diagnostics are evaluated by the Users together with the development team and modified to suit the objectives. Once all parties agree, the system is turned on to control mode to supervise existing controllers with set point changes.
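The monitoring-then-control progression can be sketched as a thin supervisory wrapper; the mode names, the advice function and the set-point write call below are placeholders, not the paper's actual software:

```python
# Placeholder sketch: in "monitor" mode the knowledge base's advice is only
# logged for Users to evaluate (running in parallel with manual control);
# in "control" mode the same advice is written out as set-point changes.
def supervise(mode, read_plant, advise, write_setpoint, log):
    data = read_plant()
    advice = advise(data)
    if mode == "monitor":
        log(advice)             # evaluated by Users and the development team
    elif mode == "control":
        write_setpoint(advice)  # supervise existing controllers
    return advice
```

The design point is that switching from monitoring to control changes only where the advice is sent, so the knowledge base validated in parallel operation is the one that goes live.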

Development Time

The time required to develop an Expert System is a function of a number of variables:


time to learn the tool;


time to learn the techniques;


time to acquire domain knowledge.

The first two items can be met with a good training program and acceptance of the technology by a variety of levels in an organization. Commitment to the project and the technology are essential to initiate successful development. However, while learning the tool and techniques can be accomplished by interested people in a very short time (2-3 days), the time to apply these methods to a knowledge domain can be considerable. Mastery of these techniques is a life-time commitment, as most tools today possess a multitude of approaches to constructing and organizing a knowledge base, meaning there is always a better way to build a system.

The biggest roadblock to success is acquisition of an Expert's knowledge. Our experience indicates that a good approach is to use someone other than the Expert for development. The "Knowledge Engineer" should not be the domain Expert, but must be motivated to become one and have more than a passing interest in the subject. Development occurs using an interview process between the Knowledge Engineer and the Expert. Experts often have difficulty describing stepwise how they apply their knowledge to make decisions. Analysis of situations and cases can often be frustrating. A Knowledge Engineer must be patient and prepared to ask "silly" questions that open up new paths in the domain, moving from input data through to final conclusions.

A sensible framework for questions to pose during the interview process might be:

Let's select a problem for consideration.

How is this problem recognized?

How is this problem solved?

How do you verify the steps?

Knowledge such as explanations and rule descriptions can also be important to establish avenues to be opened up in the future. Interviews should be scheduled and of short duration (1-1.5 hours). Interruptions should be avoided, and all parties should respect each other and have commitment to reach the planned goals.

System Maintenance

Once installed, continued development and expansion of the knowledge base is important. New knowledge may become available during construction or during early use of the installed prototype. As the system is used, knowledge may require modification, with deletion of certain concepts and addition of new features.

To accomplish this task, a "champion" of the project within the organization is needed to monitor progress and adjust the knowledge base when unforeseen occurrences arise. Training this person in the "tricks" and "techniques" of the tool is essential to continue the process of transferring AI methods into the organization.

Learning Techniques and Adaptation

Current Expert Systems do not possess true learning capability. They are unable to adjust their expertise to account for error or new happenings. They cannot be introspective and examine what they currently know in order to propose new relationships or ideas. However, there are a number of Development Tools that possess techniques for dealing with uncertainty that allow a system to adapt its rule structure as circumstances change without adding new lines of code. Once an input/output map has been constructed linking certain inputs to one or more outputs, it is possible to incorporate a new variable into the model without significantly adjusting the code. The methodology that permits this adaptation is called Fuzzy Logic.

An example of fuzzy set adaptation is as follows. Suppose we map the variable flocculant.addition_rate into a linguistic expression such as addition_low, as below:

Fuzzy Set: addition_low

Source: flocculant.addition_rate.@f

But suppose the slimes content of the slurry becomes very high. Under this circumstance, a flocculant addition rate of 30 ml/min is "not low" any longer. Instead, we find as much as 60 ml/min might be considered "low".
To accommodate this adaptation, our fuzzy set can be easily changed to:

Fuzzy Set: addition_low

Source: flocculant.addition_rate.@f - 0.03 * CERTAINTY(feed.slimes_content.very_high)

So belief in a "low" flocculant addition as a function of the actual addition rate becomes a fuzzy concept with respect to the slimes content. When the slimes content is definitely very high, the fuzzy set shifts to full belief at 30 and full disbelief at 60. But when the slimes are definitely not high, the original definition applies.
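Numerically, the adaptation can be sketched as follows. The original breakpoints (full belief at 0 ml/min, full disbelief at 30 ml/min) are an assumption, since the set's definition points did not survive, but the shift of 30 ml/min scaled by the CERTAINTY term follows the behaviour described above:

```python
def membership_addition_low(rate_ml_min, certainty_very_high):
    """Belief that flocculant addition is 'low', shifted by slimes certainty.

    Assumed original set: belief 1.0 at 0 ml/min falling to 0.0 at 30 ml/min.
    The whole set slides right by 30 ml/min * CERTAINTY(slimes very high),
    so with certainty 1.0 there is full belief at 30 and full disbelief at 60.
    """
    shift = 30.0 * certainty_very_high   # mirrors 0.03 L/min * CERTAINTY(...)
    lo, hi = 0.0 + shift, 30.0 + shift
    if rate_ml_min <= lo:
        return 1.0
    if rate_ml_min >= hi:
        return 0.0
    return (hi - rate_ml_min) / (hi - lo)

print(membership_addition_low(30.0, 1.0))  # 1.0: fully 'low' when slimes very high
print(membership_addition_low(60.0, 1.0))  # 0.0: full disbelief at 60
print(membership_addition_low(30.0, 0.0))  # 0.0: original definition applies
```

Note that no rule referencing addition_low needs to change: only the set's source expression carries the adaptation.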


There are many ways to adapt "fuzzy" linguistic concepts. Any mathematical relationship can be handled through modification of the Fuzzy Set source. Adaptation is an important way to expand an existing system without disrupting the rule structure or changing the original logic in the system. Although not strictly a "learning" technique, the relationship originally established between input and output variables is adjusted in response to changes in variables external to the original rule structure.


"There is nothing more difficult to carry out, nor more doubtful of success, nor more dangerous to handle, than to initiate a new order of things."

Machiavelli (circa 1513)

Implementation of an Expert System is perhaps the most difficult stage in the development cycle, since it involves so many different people and ideas. Success demands excellent interpersonal skills, with a little dab of psychology and problem anticipation thrown in for good measure. We are talking here about introducing a system into a large organizational network or into a plant to conduct process monitoring and control. Some of the important issues to be addressed during implementation are as follows:



Legal Issues (can we be sued for the advice given?)


Union Concerns (is labour being replaced?)


Social Issues (man-machine interface design)


Economic Situation (do we need this technology to survive?)


Political Issues (Freedom of Information vs. Right to Privacy)


Obtaining data involves decisions about sources, sensors and time. Data problems can railroad a system very quickly. For real-time applications, the speed of processing information must be considered. Data pre-processing is needed to ensure rapid reaction. This requires intelligent drivers to interface with the System and Process Sensors to filter data in a "smart" way to produce symbolic meaning for the system's knowledge base.
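One way such an "intelligent driver" might look (the smoothing constant, thresholds and symbol names below are assumptions for illustration) is to filter the raw signal and pass only its symbolic meaning upward:

```python
# Assumed sketch: exponentially smooth a noisy sensor reading, then report a
# symbol ("level.high" / "level.normal" / "level.low") to the knowledge base
# instead of raw numbers, so single spikes do not fire rules.
def make_driver(alpha=0.3, high=80.0, low=20.0):
    state = {"ema": None}
    def driver(raw):
        prev = state["ema"]
        state["ema"] = raw if prev is None else alpha * raw + (1 - alpha) * prev
        if state["ema"] > high:
            return "level.high"
        if state["ema"] < low:
            return "level.low"
        return "level.normal"
    return driver

d = make_driver()
print(d(50.0))  # level.normal
print(d(95.0))  # still level.normal: the filter damps the single spike
```

The knowledge base then reasons over stable symbols rather than every raw sample, which is the pre-processing the paragraph calls for.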



Involvement and Commitment of Top Management


User and Management Commitment (must be sustained)


Institutionalization of the System (must become essential component)


Individual Innovation and Creativity (encourage User involvement)


Expectations of the Organization


Perceived Need for the System


Benefit Analysis (should be performed early in the process)


Operations and Resources provided


Financing Arrangements (fixed versus open-ended price)


Timing and Priority Issues (early prototype implementation can be helpful)



level of complexity


adaptability of system


response time and reliability


availability and access of key personnel


lack of equipment or instruments


need to standardize knowledge and/or practices


networking difficulties or software/hardware mismatch


Interpersonal rivalries can prevent success. If the Developer is responsible for implementation, strong persuasion skills are needed to work with Management and Users. If Management or Users are to conduct implementation, excellent communication is necessary to ensure the project needs are addressed by the Developer. If only the User Group is responsible for implementation, then the initial stages of the project are extremely important to ensure proper training is given about installation and maintenance. The Development and Implementation cycles must be continuous and carefully planned if all parties are not involved in installation. An external consultant is often useful to begin the introduction of AI into an organization.

When conditions that promote successful implementation are not addressed, some Users and/or Management personnel may develop tactics for "counter-implementation" or sabotage. These include the following:


diversion of resources from the project (money or people tactics)


deflection of goals (wandering tactics)


dissipation of enthusiasm for project (dumping tactics)


neglecting the project (delay tactics)


making errors on purpose ("try to fool" tactics)


using system inappropriately (misuse tactics)


failing to use the system (ignore tactics)


relying on old methods ("no change here" tactics)

Changing goals is the simplest way to sideline a project, so be sure that all parties agree on clear and concise objectives at system initiation. Whenever this tactic is tried, redirect attention to the initial goals document and show how the changed goal will impede timely completion within budget. There are many ways to address such tactics, but it will be a difficult uphill battle without support from Management. Remember: resistance to change is a normal "human" feeling about new ideas that derives naturally from one or more of the following:


individual or departmental rivalries (power transfer to another group)


fear of obsolescence or redundancy (loss of esteem or loss of job)


group cohesiveness (rejection of "outsiders")


cultural practices (system fit with existing way of doing things)


increased workload (initial reaction to new methods)


information may be jealously guarded (my knowledge belongs to me)


invasion of privacy ("Big Brother" will find out how I use the system)


fear of the unknown (I hate computers)

The key is User commitment. Here are some ways to get this involvement:


Users suggest that the system is needed;


Users participate in the preliminary discussions;


Users obtain the required data and knowledge;


Users suggest changes to the growing system;


Users help design User Interface;


Users "smooth the way" to keep plan moving along.

Lack of commitment is a major cause of system rejection. Users must believe ES technology is useful and can increase knowledge. Often, thoughtless postures are taken: "I didn't try it because I didn't like it". One example of failure caused by this issue is the Polaris Expert System, the first Expert System for real-time supervisory control in the Mining Industry.

Located 150 km from the North Pole, the Polaris mine appeared a perfect place to introduce this technology to standardize operating practices, stabilize circuit conditions, provide training for inexperienced operators (turnover was high at Polaris), and guide plant operators. The system was developed in concert with the operators over a one-year period. Good performance is claimed, but the system needed regular maintenance. Failure occurred because the "champion" left the company and commitment to maintain the system was low. The system worked well as long as a "Baby-Sitter" was available. It seems strange that a company would spend large sums of money on a project and then let it die because no one was available to keep it going.

Political problems are also sources of failures. Sometimes union personnel do not understand that consultative ES are useful for training novices and are not aimed at worker replacement. This creates a very uncomfortable situation, and an ES user can become a traitor to his peers. When attempting to capture knowledge, any suspicious attitude can destroy the trust that must exist between operators and knowledge engineer.


The flotation advisory system built at Highland Valley Copper, Logan Lake, BC, exemplifies the reluctance of older operators to share their expertise. Some had little faith in the ES technology for training younger operators and refused to contribute to the project. When the knowledge was finally obtained and the system implemented, operators responded positively to the new ideas and actions provided.

To attack resistance to change, three stages can be used by system developer and users:

Stage 1: To create an awareness of the need for change at the organizational level or for individual workers, incentives are a powerful persuasion technique. These can include: increased bonus for high productivity, decreased "hard" work / increased "smart" work, better working conditions, more free time to perform routine or lower-priority jobs, etc.

Stage 2: Next, forces that impede introducing change are identified, diminished in intensity or redirected to develop new methods, learning attitudes and behavior. The style of implementation will be important. User involvement in construction is best at this stage. Don't force the system onto a hostile group. Introduce training sessions early.

Stage 3: The final step reinforces the change to maintain and stabilize the situation at a new level. Allow Users to evaluate and test the system; their ideas must be listened to and considered. Set up Daily Report sheets to be filled in. These documents should be easy to draft with minimum effort and time. Users should believe the system is an extension of themselves and that they are the custodians of the knowledge being utilized. If the system becomes essential to the new routine, acceptance is assured.


Rakocevic et al. consider the following definition for a real-time system:

An integrated computer program environment that responds fast enough to interrupting external events to provide accurate and fast control (or alarm) action.

At the lowest level in the process control hierarchy, Level 0, instruments monitor, sense and manipulate plant variables. These devices are connected to control units at Level 1, consisting of Programmable Logic Controllers (PLCs), single-loop controllers or Distributed Control Systems (DCS). Collection, manipulation and presentation of sensor data are based on numerical methods. At this level, a control system responds extremely fast to external events, carrying out several activities simultaneously. To provide such speed on a computer, a real-time multi-tasking Operating System is a necessity. Most AI applications for real-time control are implemented at the Supervisory stage, Level 2, although there are a number of examples of non-real-time applications at higher levels for diagnosis and advice at the Plant-Wide and Total Enterprise Levels.

Figure 1. A Real-Time Multi-Tasking Intelligent SCADA

The ProcessVision software package depicted in Figure 1 is an Expert System-based SCADA tool for Supervisory Control. Cycle time through the Knowledge Base is typically several seconds. Rakocevic et al. refer to this time scale as "pseudo" real-time. Similar to a real-time system, "pseudo" real-time provides fast and accurate control, but in a different time domain.
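A "pseudo" real-time supervisory cycle of this kind can be sketched as a fixed-period loop; the period and the read/evaluate/write callables below are placeholders rather than ProcessVision internals:

```python
import time

def supervisory_cycle(read_dcs, evaluate_kb, write_setpoints, period_s=5.0, cycles=1):
    """Evaluate the knowledge base every period_s seconds (several-second
    scale), sleeping off whatever time the evaluation did not consume."""
    results = []
    for _ in range(cycles):
        start = time.monotonic()
        advice = evaluate_kb(read_dcs())
        write_setpoints(advice)
        results.append(advice)
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, period_s - elapsed))
    return results
```

The loop is fast and regular within its own time domain, but makes no hard deadline guarantee, which is the distinction being drawn from true real-time control at Levels 0 and 1.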


AI will have increased success with development of parallel processors and faster computers. More powerful CPUs within multi-processing environments can solve many of the speed issues, allowing conventional AI to be used in real-time. Meanwhile, instead of waiting for the big event, people are studying ways to apply AI in real-time using software approaches. Hardware does exist to solve some AI difficulties, but data acquisition boards with on-board CPU chips and/or programmable EPROMs are not essential. We can provide these functions today using software on high-speed single-CPU machines (486-100s or Pentiums).



"Mathematics is the science which draws necessary conclusions."

Benjamin Peirce, 1881

Processing symbols on conventional "von Neumann"-type computers is inherently inefficient because of hardware architecture. Successful AI applications focus directly on manipulating symbols. So how can AI move effectively into an environment (real-time, direct control, etc.) that demands intensive and rapid numerical computing? Part of the problem relates to combinatorial complexity. As the number of inputs and outputs in a system increases, difficulty in dealing with all combinations of these occurrences increases exponentially.

Symbolic methods, while useful at providing explanations and access to the inner workings of a system, are costly and onerous to develop and maintain, but algorithms are available to support the high overhead of symbolic processing on a microcomputer. Without knowledge of these methods, ES applications will remain trivial, slow to develop and difficult to maintain.

The answer lies in a set of paradigms called "Computational Intelligence" that have evolved out of AI and other related disciplines. Computational Intelligence (CI) is a term coined by Bezdek in 1993 to describe "low-level" knowledge in the style of the mind. CI consists of very "primitive" concepts in the AI sense that support the beginnings of Symbolic Knowledge, which Bezdek called "tid-bits". These "elements" can become the inputs to an AI structure that processes symbols heuristically or in other ways. Primitives are fundamental operations conducted on numbers.

These fundamental operations are the primitive steps in any complex numerical structure that produce elementary symbols for AI processing. Current hardware can deal with these numerical manipulations efficiently, suggesting that CI can become the fundamental support structure for conventional AI methodologies. The components of CI may include Fuzzy Logic mappings, Artificial Neural Network connections and/or Genetic Algorithm optimizations, all of which depend on numerical processing to support the generation of symbolic knowledge.
This definition seems limiting in that Computational Intelligence should not rely only on pure mathematics. For true intelligence, CI should include AI methods such as heuristics to allow rapid computations, even when gross error may be contained in the output.

To bring AI into the control hierarchy of a real-time environment, a CI module that creates "primitive" symbols has to be very fast (~ millisecond processing time). The key trade-off in all real-time computing systems is always accuracy versus processing speed. Incorporating AI methods into a CI module introduces certain error, but produces appreciable speed. With proper setup, direct connection between the CI and AI levels can assist with error detection and interpretation at the AI level. The AI module can test symbolic outputs from co-operating sensors, and tune out error using a feedback loop to the CI module as in Figure 2.


Figure 2. Error Detection in the AI module to adapt the CI module.
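This feedback loop can be sketched as follows; the calibration offset, the threshold and the averaging correction rule are all assumptions used only to illustrate the CI/AI division of labour:

```python
def ci_primitive(reading, offset, threshold=50.0):
    """CI module: one subtraction and one comparison yield a symbolic 'tid-bit'."""
    return "above" if reading - offset > threshold else "below"

def ai_reconcile(primary, backup, offset, threshold=50.0):
    """AI module: compare the symbols produced from two co-operating sensors;
    on disagreement, feed a correction back to the CI module's offset."""
    if ci_primitive(primary, offset, threshold) != ci_primitive(backup, offset, threshold):
        offset += (primary - backup) / 2.0   # assumed correction rule
    return offset

# Sensors agree: offset unchanged.         ai_reconcile(60.0, 58.0, 0.0) -> 0.0
# Sensors disagree: offset nudged by 3.0.  ai_reconcile(56.0, 50.0, 0.0) -> 3.0
```

The CI step stays cheap enough for millisecond operation, while the slower symbolic comparison and tuning happen at the AI level, matching the arrangement in Figure 2.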

Borrowing an analogy from Bezdek, an individual given an apple can immediately recognize its shape and colour. With eyes closed, however, other senses must be used: touch, smell and taste. Smell and taste produce real-time decisions about the apple, but touch and feel may require additional processing to visualize mentally an apple's shape and texture. Such symbols are recalled and compared with measurements less accurate than sight. So too can CI assist AI to make rapid decisions intelligently, monitor the selection and correct it if necessary.

To do this effectively, CI needs the following items:

rules (inferences and relationships)

prior knowledge (to direct the CI process)

symbolic "primitives" (output from the CI module)

The approach is similar to the hierarchy of human intelligence depicted in Figure 3. Biological Intelligence consists of manipulating symbols supported by low-level numerical processing to generate belief in a particular symbol. This hierarchy is mirrored in the arrangement of AI, with Computational Intelligence forming the foundation for rapid problem solving.
The boundary between AI and CI is 'fuzzy', with Fuzzy Logic, Artificial Neural Networks and Genetic Algorithms playing cross-over roles. In the future, the boundary between Biological and Artificial Intelligence will become 'fuzzy' as robots indistinguishable from humans become common. Machines will have chemically-based processors; people will have electronic thinking replacement parts, but this is well in the future.

If we apply crisp concepts to define the components of intelligence, we find 3 distinct levels: Biological Intelligence (BI) or Artificial Intelligence (AI), Symbolic Intelligence and Computational Intelligence (CI). In a similar fashion to man-made intelligent machines, we also find different types of Biological Intelligence (otherwise we might not eat beef!). Humans are at the highest level, the "sentient" species. More primitive forms of life are less intelligent and yet perform complex tasks effectively and efficiently.


Figure 3. The Similarities of Biological and Artificial Intelligence

Much AI work has attempted to imitate the minds of certain organisms. For example, DARPA set a goal in 1987 to mimic the intelligence of the honey bee by the year 1992, a goal requiring an immense number of connections in an artificial neural network. We do not know if the project succeeded, but clearly, this work is moving us toward Artificial Life and Virtual Reality. So, is there a boundary between AI and BI, between a true reality and a virtual one? CI is an important element of this boundary.

Within Biological Intelligence, the ability of autistic savants to carry out rapid and accurate data calculations, musical recall, etc., is an example of how the human brain can perform accurate real-time computation. Whether the output is intelligent or not is determined by those who interact with such exceptional people. Perhaps they use fuzzy set definitions with very broad support characteristics (see Figure 4). Such curves provide rapid interpolation, but sensible output from an individual set is not clear. While savants can tell the day of the week for a particular date, they are unable to understand the significance of the date in question.

Figure 4. Examples of Fuzzy Set definitions for normal and autistic humans.



"When the only tool you have is a hammer, all your problems look like nails."

Lotfi Zadeh

Fuzzy Expert Systems use Fuzzy Sets and Fuzzy Logic to perform qualitative modelling and/or control of processes when no model has been identified or developed from first principles. Fuzzy Logic allows us to generalize across a space map by interpolating from rule node to rule node. While the answer is an approximation of the "correct" one, it is usually acceptable. Gains in search and storage efficiencies are remarkable.

Fuzzy Systems are adaptable, in that the mapping of fuzzy concepts onto a Universe of Discourse can change in response to external forces. The context of the Fuzzy I/O Map (called a Fuzzy Associative Memory or FAM) can be taken into account so that the term "tall", for example, can mean one thing to a pygmy and another to a giant.

A Fuzzy Logic Controller operates with a series of rules that link input linguistic expressions to output expressions. The rules combine inputs using AND and OR operators. An Inference Method is used to determine the Degree of Belief in each output expression. Typically, the Minimum degree of belief is selected from ANDed facts and the Maximum degree of belief from ORed facts. This generates a Net Degree of Truth in each output expression, which is then multiplied as a fraction by the Fuzzy Set definition for each output. To calculate a discrete output value, the individual output Fuzzy Sets are combined into a single distribution function using an averaging approach. This function is then defuzzified by taking the centroid position. Smith delineated over 90 different Inference-Defuzzification options and has shown the advantage of switching methods dynamically when circumstances change.
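The inference and centroid steps described above can be sketched as follows. The triangular set definitions, the two rules and the variable ranges are illustrative assumptions, not taken from any controller in this paper:

```python
def tri(x, a, b, c):
    # triangular membership function rising from a, peaking at b, falling to c
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_step(error, d_error):
    # Input linguistic expressions (illustrative set definitions)
    neg_e, pos_e = tri(error, -2, -1, 0), tri(error, 0, 1, 2)
    neg_d, pos_d = tri(d_error, -2, -1, 0), tri(d_error, 0, 1, 2)

    # ANDed facts take the MINIMUM degree of belief
    # (ORed facts would take the MAXIMUM)
    belief_low = min(neg_e, neg_d)    # IF error neg AND change neg THEN output low
    belief_high = min(pos_e, pos_d)   # IF error pos AND change pos THEN output high

    # Multiply each output Fuzzy Set by its Net Degree of Truth, sum the scaled
    # sets into one distribution function, then defuzzify at the centroid
    xs = [i / 100.0 for i in range(101)]   # output universe of discourse [0, 1]
    agg = [belief_low * tri(x, -0.5, 0.0, 0.5) +
           belief_high * tri(x, 0.5, 1.0, 1.5) for x in xs]
    total = sum(agg)
    return sum(x * m for x, m in zip(xs, agg)) / total if total else 0.5
```

A strongly positive error and trend pushes the defuzzified output toward the high end of the universe; a strongly negative pair pushes it low.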

Alternatively, each single output Fuzzy Set can be "linguistically" defuzzified into a variety of expressions based on the degree of belief in the concept. Adaptation of these expressions is possible by using an "alpha" or acceptance factor. This permits generation of a multi-variable FAM that generates variable "fuzzy" output expressions. Figure 5 shows an example of how the "acceptance" factor can be determined from three types of variables: in this case, Economic, Technical and Social-Political variables are combined to generate it.
NOTE: 0 = rejection; 100 = acceptance

Figure 5. A 3-dimensional Fuzzy Associative Map used to calculate an Acceptance Factor (alpha) to adapt output from a Fuzzy Logic Controller.


h variable grouping derives from many external variables that may use other FAM rules. This particular
example was devised to accommodate different perceptions of the intensity of man
made mercury emissions in
an Expert System designed to diagnose Hg
tion problems in the Amazon.

It was considered that the behavior of workers depends on social incentives and reactions. As a result, developments in a society change our view of high and low levels of Hg emissions. A high alpha indicates acceptance of Hg use and low control of emissions by a society, country or region. In the Amazon, alpha is 1.0, while in Canada, where Hg is practically banned and well-monitored by authorities, it is much lower (0.1 or 0.01). When the hazardous effects of Hg were unknown and thousands of miners were colonizing the West 150 years ago, alpha would be much higher (10).




w_i = degree of importance of input variable i
u_i = degree of belief in input variable i

Figure 6. Linguistic output of DoB in High Emission Factor showing its elasticity with alpha.

A term called "High Emission Factor" (HEF) is used to represent the collection of observations about a mining operation that derive belief in high emissions. Belief in HEF is linguistically defuzzified according to the relationship depicted in Figure 6. Note how the term used to describe Hg emissions depends on the value of alpha. When alpha is high, indicating acceptance of Hg use, to conclude that emissions are "extremely high", belief in HEF must also be very high (>98%). When alpha is very low, indicating rejection of Hg use, "high" emissions derive from belief in HEF as low as 7%. Elasticity with this technique is substantial.
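A rough illustration of linguistic defuzzification whose thresholds stretch with the acceptance factor: the sketch below maps belief in HEF to a linguistic term. All cut points are invented except the 7% and 98% anchors quoted above:

```python
TERMS = ["negligible", "low", "moderate", "high", "extremely high"]

def describe_emissions(hef, alpha):
    # Cut points (% belief in HEF) at alpha = 0: a society rejecting Hg use
    # calls emissions "high" from as little as 7 % belief in HEF.
    base = [1, 3, 7, 15]
    # Stretch each cut point toward 100 % as alpha rises; at alpha = 1 the top
    # term requires ~98 % belief, matching the elasticity described in the text.
    targets = [30, 55, 80, 98]
    cuts = [b + (t - b) * alpha for b, t in zip(base, targets)]
    label = TERMS[0]
    for term, cut in zip(TERMS[1:], cuts):
        if hef >= cut:
            label = term
    return label
```

The same belief value thus yields a much stronger linguistic expression in a region that rejects Hg use than in one that accepts it.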


"When something is new, they say 'it's not true.' When its truth becomes obvious, they say 'it's not important.' When its importance cannot be denied, they reason, 'it's not new.'"

William James

The area of Computational Intelligence offering true learning capability is that of Artificial Neural Networks. The paradigms in this field are based on direct modelling of the human neuronal system and hence, researchers in this area are often referred to as "Connectionists".


Basically, an I/O space map is modeled as a series of interacting nodes in a network. When a node fires (taking on a signal value > 0), the signal is amplified by multiplying it by a "weight" assigned to each connection between this node and others. At the receiving node, all incoming signals are summed to give an overall force acting on the neuron. This sum is passed through an activation function to set a signal strength from 0 to 1 for the neuron. This signal is sent to other nodes to which this node is connected. The process continues with a new node until all nodes are exhausted and the signal strengths of all outputs are known.
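The firing sequence just described can be written compactly. The sigmoid activation and the list-of-lists weight layout are assumptions of this sketch:

```python
import math

def forward(inputs, w_hidden, w_out):
    """One forward pass through a small feed-forward net.
    w_hidden[j][i] weighs input i into hidden node j; w_out[k][j] likewise.
    The bias is modelled as an extra always-on input (value 1.0)."""
    def sigmoid(z):
        # activation function squashing the summed force into (0, 1)
        return 1.0 / (1.0 + math.exp(-z))
    x = inputs + [1.0]                               # append bias signal
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_hidden]
    h = hidden + [1.0]                               # bias into the output layer
    return [sigmoid(sum(w * hi for w, hi in zip(row, h))) for row in w_out]
```

With all weights zero, every neuron sums to zero and outputs the sigmoid midpoint, 0.5; strongly excitatory weights drive the output toward 1.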

The full power of this method will only be exploited when parallel-processing computers become more widely available, so that many neurons in the network are dealt with at once. Nevertheless, implementations of these systems with large numbers of connections (hundreds) perform extremely well on today's high-speed computers.

Figure 7 shows the structure of an n x n x n 3-layer network with bias.

Figure 7. Structure of a 3-Layer Artificial Neural Network with Bias.

The advantage of neural networks is their ability to learn or adapt to changing circumstances. Learning begins by selecting a set of random weights and presenting the network with a set of known I/O values from a large set of training data. The network is fired and predicted outputs are calculated. These are compared with the desired output and the error is passed back through the network using a regression or gradient-descent technique. New weights are derived and the network examines a new set of I/O data. This process continues until the overall error is within a pre-defined tolerance or until a selected number of iterations have executed. This is known as Supervised Learning, with Back-Propagation being the most widely-used technique.
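A minimal supervised-learning loop in this spirit: gradient-descent back-propagation on a 3-layer net with bias. The learning rate, hidden-layer size and stopping rule are arbitrary choices for illustration:

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, n_hidden=3, rate=0.5, epochs=2000, tol=0.01):
    """Back-propagation on a 3-layer net with a single output.
    `data` holds (inputs, target) pairs with targets in (0, 1)."""
    random.seed(1)                                     # start from random weights
    n_in = len(data[0][0])
    wh = [[random.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hidden)]
    wo = [random.uniform(-1, 1) for _ in range(n_hidden + 1)]

    def predict(inputs):
        x = list(inputs) + [1.0]                       # bias input
        h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in wh] + [1.0]
        return sigmoid(sum(w * hi for w, hi in zip(wo, h)))

    sse = float("inf")
    for _ in range(epochs):
        sse = 0.0
        for inputs, target in data:
            x = list(inputs) + [1.0]
            h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in wh] + [1.0]
            y = sigmoid(sum(w * hi for w, hi in zip(wo, h)))
            err = target - y
            sse += err * err
            # pass the error back: output delta first, then hidden deltas
            d_out = err * y * (1 - y)
            d_hid = [d_out * wo[j] * h[j] * (1 - h[j]) for j in range(n_hidden)]
            for j in range(n_hidden + 1):
                wo[j] += rate * d_out * h[j]
            for j in range(n_hidden):
                for i in range(n_in + 1):
                    wh[j][i] += rate * d_hid[j] * x[i]
        if sse < tol:                                  # error within pre-defined tolerance
            break
    return predict, sse
```

Trained on a simple AND-like data set, the net quickly separates the one positive case from the rest.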

The trained network is put into service to deal with the particular environment it was designed to handle. Its performance is monitored on a regular basis by examining the accuracy of its predictions. Should drift occur due to changing external forces, the network can be removed temporarily from service to be retrained on more up-to-date data.

There are a number of ways in which a network can be configured: the Perceptron model, the 3-layer Back-Propagation model, Kohonen networks with unsupervised learning, CMAC or Cerebellum Model Articulation Control, Holographic networks, etc. Each has its own unique advantages and disadvantages. The Perceptron model, for instance, cannot deal with problems that have non-separable space mappings. Kohonen networks are used to classify input patterns rather than for process control. Back-Propagation methods take considerable time to learn and can get trapped in local-minima regions.


AND Corporation of Hamilton, Ontario has developed a novel approach to neural modeling very different from conventional techniques. Instead of training a network by feeding one set of I/O data at a time, all data are analysed together to produce an I/O map that directly links input and output data. The method is not a connectionist model. Instead, complex-number arithmetic is used to perform regression analysis. The phase of a complex vector represents the actual value of an item, while the amplitude of the vector represents its Degree of Belief.

Holographic systems require only single cells to learn stimulus-response associations. These systems can actually learn in real time, unlike connectionist models, which require considerable time to learn an I/O map. Holographic nets map all input/output data in one pass directly into the structure of the neuron "cell". Speed of learning is orders of magnitude greater than existing back-propagation techniques.

Holography is not simply retrieval of patterns. All I/O patterns are actually superimposed onto the same set of synapses. The model has the advantage of conventional network accuracy but at only a fraction of the memory and processing time to perform a similar pattern-ID task. HNet generalizes by performing interpolation among the trained I/O mappings, much as occurs in a Fuzzy Logic controller. Similar to the numerous Defuzzification and Inferencing options available, holographic networks allow a User to modify the generalization characteristics so that mappings overlap as interpolation is performed.
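AND Corporation's actual HNet algorithm is proprietary; the toy below only illustrates the phase-encoding idea described: stimulus values mapped to phases on the unit circle, with every association superimposed in one pass onto a single set of "synapses":

```python
import cmath, math

def to_phase(x):
    # map a real value in [0, 1) onto the unit circle (phase encoding)
    return cmath.exp(1j * 2 * math.pi * x)

class HoloCell:
    """Toy holographic 'cell': all stimulus-response associations are
    superimposed onto one complex weight vector (a rough sketch, not HNet)."""
    def __init__(self, n):
        self.w = [0j] * n
        self.n = n

    def learn(self, stimulus, response):
        s = [to_phase(x) for x in stimulus]
        r = to_phase(response)
        # one-pass learning: accumulate conj(stimulus) * response
        for k in range(self.n):
            self.w[k] += s[k].conjugate() * r

    def recall(self, stimulus):
        s = [to_phase(x) for x in stimulus]
        z = sum(sk * wk for sk, wk in zip(s, self.w)) / self.n
        # decode the phase back to [0, 1); the magnitude acts as a
        # Degree of Belief in the recalled value
        phase = cmath.phase(z) % (2 * math.pi)
        return phase / (2 * math.pi), abs(z)
```

With one stored association, recall is exact and its magnitude (belief) is 1.0; as more patterns are superimposed, crosstalk lowers the magnitude.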


"Some call it Evolution and others call it God."

William H. Carruth

There has been much excitement in recent years in the study of Genetic Algorithms, a concept based on Darwin's criterion of "Survival of the Fittest". Genetic Algorithms and/or Genetic Computing are methods that automate the search for a better solution. The process involves representing solutions by a string of numbers and "mating" them two at a time. The "child" of each simulated "procreation" tends to have better attributes than either parent. The method is an adaptive parallel-search strategy to locate a global optimum point without becoming trapped within local-minima positions. The algorithm uses the following operations:


- a chromosomal representation of problem solutions;
- a method to create an initial population of possible solutions;
- an objective function that rates solutions in terms of "fitness";
- genetic operators to alter solution composition during reproduction.

GA systems analyse and evaluate each member of a population of solutions, ranking them on a "best-fit" criterion. A number of genetic operators are applied to the population to create a set of new solutions, which are then explored. The process continues until all members converge to a stable position. The three operations used are reproduction, cross-over and mutation. The latter provides a periodic push to find a new and better solution.
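The four operations listed above can be sketched as a minimal GA. The bit-string encoding, population size, rates and the "one-max" objective are illustrative choices, not tied to any application in this paper:

```python
import random

def genetic_search(fitness, n_bits=16, pop_size=30, generations=60,
                   cx_rate=0.9, mut_rate=0.02):
    """Minimal GA: bit-string chromosomes, fitness-ranked (tournament)
    reproduction, one-point cross-over, and mutation as the periodic
    push out of local optima."""
    random.seed(7)
    # initial population of random candidate solutions
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            # reproduction: keep the fitter of two random members, twice
            p1 = max(random.sample(pop, 2), key=fitness)
            p2 = max(random.sample(pop, 2), key=fitness)
            c1, c2 = p1[:], p2[:]
            if random.random() < cx_rate:          # one-point cross-over
                cut = random.randint(1, n_bits - 1)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (c1, c2):                 # mutation
                for i in range(n_bits):
                    if random.random() < mut_rate:
                        child[i] ^= 1
                new_pop.append(child)
        pop = new_pop[:pop_size]
    return max(pop, key=fitness)

# maximize the number of 1-bits ("one-max"), a standard toy objective
best = genetic_search(fitness=sum)
```

In practice the objective function would rate, say, candidate Fuzzy Set definitions or initial network weights rather than counting bits.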


To develop learning capability, Expert Systems (whether Fuzzy or not) need to be able to interact with a number of these important algorithms. In the case of the connectionist models, the weights of the network can be converted into rules for incorporation into a Knowledge Base. By examining the weights in a scaled fashion, it is possible to formulate symbolic meaning for the hidden neurons within the structure of the network. In this way, the network can be adapted to provide explanations to a User rather than remaining a simple "black box".

Learning can be accelerated by incorporating a Holographic approach to calculate the weights of the connectionist network. Although direct application of the holographic network is likely the fastest and most efficient process, it too suffers from being a "black box". The Expert System which polices and controls all learning systems thus becomes the reservoir for the intelligent symbolic representation of the knowledge domain, with its symbols derived from the neural methods which learned from the training data sets.
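One crude way to mine scaled weights for symbolic meaning is sketched below; the variable names, cut-off and rule wording are invented for illustration:

```python
def weights_to_rules(w_hidden, in_names, hid_names, threshold=0.6):
    """Translate scaled network weights into rough symbolic rules so that a
    hidden neuron can be explained rather than staying a 'black box'."""
    rules = []
    for j, row in enumerate(w_hidden):
        biggest = max(abs(w) for w in row) or 1.0
        for i, w in enumerate(row):
            s = w / biggest                    # scale weights into [-1, 1]
            if abs(s) >= threshold:            # keep only the dominant links
                sign = "supports" if s > 0 else "opposes"
                rules.append(f"IF {in_names[i]} THEN {sign} {hid_names[j]}"
                             f" (strength {s:+.2f})")
    return rules
```

Rules extracted this way are approximations, but they give a Knowledge Base something it can show a User as justification.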

Development of Computationally-Intelligent Real-Time Systems


For proper organization, it is necessary to divide the important processing functions into separate modules as shown in Figure 1. Each module coordinates individual tasks such as data storage, message transfer, alarm sensing, event scheduling, process/user communication, knowledge processing and explaining. The modules communicate with each other, the user and the process. As the system grows in size and complexity, availability and turn-around of resources decline until eventually additional hardware is required (more memory or more CPU).

To delay the onset of such problems for an expanding system, or to design for maximum efficiency with currently available resources, intelligent computation must be included in the system at the initial development stage. Traditionally, intelligent SCADA systems have been applied for supervisory control service only. Typical cycle times for knowledge-base processing average in seconds. Typical data-update times are 1 second to ensure efficient utilization of system resources. For some applications, however, this is too long a delay. Certainly, for direct regulatory control, this sampling interval is inappropriate. Also, for rapidly-changing variables that indicate important process transformations, much faster measurements are required.

To apply a rule-based system to batch data such as on-stream XRF assays, temperature profiles of steel billets, or thickness measurements of paper on a roll from a paper mill, it is necessary to sample at much faster rates. Suppose the temperature of a billet surface is to be recorded as it passes in front of an optical pyrometer. The duration of measurement is relatively short (~2 seconds) but the intensity of data output is substantial (perhaps as high as 2000 points). The time interval must drop to 1 millisecond to accomplish this task.

Storage of such data (much of which is redundant) within a conventional SCADA database is inefficient. RAM requirements are exorbitant and update delays are obvious. A means to apply pre-filtering is necessary. A special data-acquisition board is used with on-board RAM and, in some cases, an on-board CPU. These boards range in price from 2000 to 5000 dollars. Up to 10,000 points per channel with up to 16 channels can be acquired on a board with 4 Mb of RAM. This data must be filtered using an intelligent data-communication driver. MetraByte's DAS-20 board is an example of such hardware.

The driver is a C-code program for communication between the board and RAM-resident datafiles. Functions to interpret shape and trend features in the data prepare symbols for the AI knowledge base. All functions and channels are configured using an ASCII initialization file. Heuristics ensure rapid computation of the required features and filtered data.
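A sketch of the kind of pre-filtering such a driver might perform, reducing a high-rate burst (e.g. ~2000 pyrometer points over 2 seconds) to a few symbols a knowledge base can reason about. The trend and spike heuristics are illustrative, not those of the actual C driver:

```python
def summarize_channel(samples, spike_sigma=3.0):
    """Condense one burst of high-rate samples into symbolic features."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    std = var ** 0.5
    # trend symbol: compare the means of the first and last thirds of the burst
    head = sum(samples[: n // 3]) / (n // 3)
    tail = sum(samples[-(n // 3):]) / (n // 3)
    if tail - head > std:
        trend = "rising"
    elif head - tail > std:
        trend = "falling"
    else:
        trend = "steady"
    # spike count: samples far outside the burst's own scatter
    spikes = (sum(1 for s in samples if abs(s - mean) > spike_sigma * std)
              if std else 0)
    return {"mean": mean, "trend": trend, "spikes": spikes}
```

Only the resulting handful of symbols, rather than the raw 2000 points, then needs to reach the knowledge base on each cycle.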


"A conclusion is something I believe to be true. Its proof is for you to judge."

Marcello Veiga

AI techniques have become important computing tools to develop applications based on mental models of processes and domain knowledge. The efficiency of future methods will depend on the use of Computational Intelligence techniques such as Fuzzy Logic, Artificial Neural Networks and Genetic Algorithms. These methodologies form the foundation on which both biological and electronic intelligence depend.

Three methods to adapt Fuzzy Logic-based systems are available: automatic adjustment of a Fuzzy Set source using a collective variable, dynamic switching of the Inference-Defuzzification method, and adaptive Linguistic Defuzzification. Each technique provides elasticity to the output generated by a Fuzzy Logic Controller.

Artificial Neural Networks promise true learning capabilities but are unable on their own to explain the rationale behind their conclusions. As modeling tools, they will likely play an important role in providing rapid pattern-matching capabilities. Holographic Networks bear examination as potential real-time learning tools.

Genetic Algorithms are one of the most efficient optimization techniques available to generate Fuzzy Set definitions or initial neural network weights.

Integration of these methods into computationally-intelligent communication drivers provides a means to supply accurate symbolic knowledge for processing by conventional AI knowledge bases in a supervisory fashion. Even data that must be sampled at an extremely rapid rate can be analysed by a conventional Expert System that uses a data-filtering methodology based on Artificial Intelligence.



Acknowledgments

The author expresses his sincere appreciation to the students who participated in the industrial projects conducted at UBC over the past few years: Sunil Kumar, Lester Jordon, Paul Benford, Philippe Poirier, Marcello Veiga, Randy Gurton, Cliff Mui, Ken Scholey, Edgardo Cifuentes and Vladimir Rakocevic. Because of these students and their outstanding work, the Professor has become the Student. Encouragement and support from Keith Brimacombe and Indira Samarasekera in times of trouble are also acknowledged. Software support from Comdale Technologies, Toronto is also gratefully acknowledged.


References

1. McDermott, K., Clyle, P., Hall, M. and Harris, C.A., 1992. "An Expert System for Control of No.4 Autogenous Mill Circuit at Wabush Mines", Proc. Canadian Mineral Processors, Paper 24, pp.20.

2. Freeman, N., Kemp, T. and Legg, J., 1990. "Development of Operator Guidance System for Pb Blast Furnace", Proc. Australia AI90 Conf., Perth, pp.11.

3. Eggert, J., Folinsbee, J. and Benford, P., 1994. "SAG Mill Control at Dome Mines Using an Expert System", Proc. Canadian Mineral Processors, Ottawa, pp.16.

4. Benford, P. and Meech, J.A., 1992. "Advising Flotation Operators Using a Real-Time Expert System", Minerals Engineering, 5(10-12), 1325.

5. Lacouture, B., Russell, C., Griffin, P. and Leung, K., 1991. "Copper Flotation Expert System at Mt. Isa Mines Limited", Eds. Dobby, G.S., Argyropoulos, S.A. and Rao, S.R., 429.

6. Edwards, R. and Mular, A.L., 1992. "An Expert System Supervisor of a Flotation Circuit", CIM Bulletin, 84(959), 69.

7. Meech, J.A. and Kumar, S., 1994. "A HyperManual on Expert Systems - Version 3.0", Ottawa, 4723 electronic pages.

8. Harris, C.A. and Kosick, G.A., 1988. "Expert System Technology at the Polaris Mine", Proc. Canadian Mineral Processors Conf., Ottawa, pp.25.

9. Poirier, P. and Meech, J.A., 1993. "Using Fuzzy Logic for On-Line Trend Analysis", Proc. 2nd IEEE Conference on Control Applications, Vancouver, Vol. 1, 83.

10. Poirier, P., Raabe, H. and Meech, J.A., 1993. "Using Froth Identification in an Advisory Expert System for Copper Flotation Operations", Proc. Canadian Mineral Processors Conf., Ottawa, No. 36, pp.14.

11. Meech, J.A. and Harris, C.A., 1992. "Expert Systems for Gold Processing Plants", Randol Gold Forum, Vancouver, B.C., 45.

12. Harris, C.A. and Meech, J.A., 1987. "Fuzzy Logic: A Potential Control Technique for Mineral Processing", CIM Bulletin, 80(905), 51.

13. Rakocevic, V.M., Meech, J.A., Kumar, S., Gurton, R., Samarasekera, I.V. and Brimacombe, J.K., 1994. "Intelligent Computation in a SCADA System for Continuous Casting of Steel Billets", Proc. CIM Dist. 6, Vancouver, 53.

14. Bezdek, J.C., 1994. "What is computational intelligence?", Computational Intelligence - Imitating Life, Proc. World Congress on Computational Intelligence, Orlando, Fla., IEEE Publications, 1.

15. Meech, J.A. and Jordon, L.A., 1993. "Development of a Self-Tuning Fuzzy Logic Controller", 6(2), 119.

16. Smith, M.H., 1993. "Parallel Dynamic Switching of Reasoning Methods in a Fuzzy System", Proc. 2nd IEEE Inter. Conf. on Fuzzy Systems (FUZZ), pp.968.

17. Smith, M.H. and Takagi, H., 1993. "Optimization of Fuzzy Systems by Switching Reasoning Methods", Inter. Conf. on Fuzzy Systems, Seoul, Korea, pp.8.

18. Veiga, M.M. and Meech, J.A., 1994. "HgEx - A Heuristic System on Mercury Pollution in the Amazon", Water, Air & Soil Pollution, 14, 204.

19. Veiga, M.M. and Meech, J.A., 1994. "Application of Fuzzy Logic to Environmental Risk Assessment", Proc. IV Encuentro Hemisferio Sur sobre Tecnologia Mineral, Concepcion, Chile, pp.17.

20. Meech, J.A., Veiga, M.M. and Hypolito, R., 1995. "Educational Measures to Address Hg Pollution from Gold-Mining Activities in the Amazon" (in press), pp.15.

21. Veiga, M.M. and Meech, J.A., 1994. "Heuristic Approach to Mercury Pollution in the Amazon", International Sym. on Extraction and Processing for the Treatment and Minimization of Wastes, 123rd Congress of TMS, San Francisco, CA, 23.

22. Zeidenberg, M., 1990. "Neural Networks in A.I.", Ellis Horwood Ltd., W. Sussex, England.

23. Touretzky, D.S. and Hinton, G.E., 1988. "A Distributed Connectionist Production System", Cognitive Science, 12(3), 423.

24. Blum, A., 1992. "Neural Networks in C++", Wiley Professional Computing Series, New York, NY.

25. Rakocevic, V.M. and Meech, J.A., 1995. "An Artificial Neural Network to Interpret Froth Images from a Flotation Process", CIM Bulletin (in press), pp.16.

26. HNet Professional Users Manual, 1992. AND Corporation, Hamilton, Ontario, L8P 1C8.

27. Soucek, B. (Ed.), 1993. "Fuzzy, Holographic and Parallel Intelligence", Wiley & Sons, New York, NY.

28. Goldberg, D., 1989. "Genetic Algorithms in Search, Optimization & Machine Learning", Addison-Wesley, Reading, Mass.

29. Kumar, S., Meech, J.A., Samarasekera, I.V. and Brimacombe, J.K., 1993. "An expert system to diagnose quality problems in continuous casting of steel billets", Iron & Steelmaker, ISS, 20(9), 29.