Experiments with Interaction on Arduino

Purpose of this Site

This page focuses on demonstrating pattern input and output with a physical experiment.

While in "On Logical Triangulation" and "On Patterns and Learning through Interaction", a foundation was given of how thinking may be undertaken, this paper shows how an artificially intelligent agent, referred to also as the "system", may interact with its environment - by input and output of patterns.

An example implementation of a simple artificial intelligence on an Arduino UNO R3 is given - it is a kind of "blinkenlights chatbot". It receives blinkenlights patterns (through pushbuttons and the light-emitting diodes connected to them) and answers with LED patterns of its own.

Wiring of the Arduino Artificial Intelligence
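To give an impression of the hardware side, a minimal Arduino setup sketch could look as follows. The pin numbers, the number of channels and the pull-up wiring are my own assumptions for illustration - the actual assignments are those shown in the wiring figure above and used in blinkensource.cpp.

// Minimal hardware sketch (illustrative): pin numbers and wiring are assumed,
// not taken from blinkensource.cpp - see the wiring figure for the real setup.

const int BUTTON_PINS[] = {2, 3, 4};   // pushbuttons for the blinkenlights input (assumed pins)
const int LED_PINS[]    = {8, 9, 10};  // LEDs for the system's answer patterns (assumed pins)
const int NUM_CHANNELS  = 3;

void setup() {
  for (int i = 0; i < NUM_CHANNELS; i++) {
    pinMode(BUTTON_PINS[i], INPUT_PULLUP);  // buttons switch to ground, internal pull-ups
    pinMode(LED_PINS[i], OUTPUT);
  }
}

void loop() {
  // Simple echo test of the wiring: light each LED while its button is held.
  for (int i = 0; i < NUM_CHANNELS; i++) {
    bool pressed = (digitalRead(BUTTON_PINS[i]) == LOW);
    digitalWrite(LED_PINS[i], pressed ? HIGH : LOW);
  }
}

If the wiring is correct, pressing a button lights its LED; the actual sketch builds its interaction loop on top of exactly this kind of input and output.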

The blinkenlights system operates according to the following principles.

PRINCIPLES OF INTERACTION

1. A strong artificial intelligence does, in my view, nothing more than this:

a) - It receives a pattern by its input sensors;

b) - it "reasons" (by whatever mechanism - neural nets, Bayesian, logical, or my own Logical Triangulation or Fragmentation Sets) on the pattern and establishes "analogies" - operating on analogies is in my view the core requirement of an artificial intelligence;

c) - it memorizes (or "learns") the pattern as an appropriate response for a given situation - namely, the situation in which the pattern was observed;

d) - it outputs a response that has been learned as "suitable" for a situation that is similar to the perceived input pattern.

2. In order to connect with the outer world and interact with it, an artificial intelligence needs this:

a) - An automatic drive to process information - whatever is supplied will be processed;

b) - input of information by means of sensors - when a sensor perceives something, it generates a "thought" for further processing (when you see something green, you see it not because you want to, but because your eyes make that thought appear in your mind);

c) - output of information by means of actors - when a certain "thought" is entertained, the system acts out a corresponding action (when you actually trigger finger movement in order to grasp something, your fingers simultaneously go ahead and move);

d) - instincts, i.e. primordial relations that connect some inputs to some actions. Starting from these, the system can learn other "appropriate" actions through its chosen internal reasoning mechanism; a minimal code sketch of this interaction loop is given below.
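To make the four steps of principle 1 and the four requirements of principle 2 concrete, the following is a minimal interaction loop in the Arduino style. It is not the original blinkensource.cpp: the pin numbers, the pattern memory, the example "instincts" and the very simple "reasoning" (picking the memorized situation with the smallest Hamming distance as the analogy) are my own assumptions for illustration.

// Illustrative interaction loop (not the original blinkensource.cpp):
// pins, memory layout and the Hamming-distance "reasoning" are assumptions.

const int BUTTON_PINS[] = {2, 3, 4};    // assumed input pins
const int LED_PINS[]    = {8, 9, 10};   // assumed output pins
const int NUM_CHANNELS  = 3;

const int MEMORY_SIZE = 16;
byte situations[MEMORY_SIZE];   // observed input patterns, one bit per button
byte responses[MEMORY_SIZE];    // learned output patterns, one bit per LED
int  memoryCount = 0;

void setup() {
  for (int i = 0; i < NUM_CHANNELS; i++) {
    pinMode(BUTTON_PINS[i], INPUT_PULLUP);
    pinMode(LED_PINS[i], OUTPUT);
  }
  // 2d) "Instincts": primordial input->output relations given at the start.
  situations[0] = 0b001; responses[0] = 0b100;
  situations[1] = 0b010; responses[1] = 0b010;
  memoryCount = 2;
}

// 2b) Sensor input: read the pushbuttons into one pattern byte.
byte readPattern() {
  byte p = 0;
  for (int i = 0; i < NUM_CHANNELS; i++) {
    if (digitalRead(BUTTON_PINS[i]) == LOW) p |= (1 << i);
  }
  return p;
}

// 2c) Actor output: act the chosen pattern out on the LEDs.
void showPattern(byte p) {
  for (int i = 0; i < NUM_CHANNELS; i++) {
    digitalWrite(LED_PINS[i], (p & (1 << i)) ? HIGH : LOW);
  }
}

// 1b) "Reasoning" by analogy: find the memorized situation most similar
// to the perceived pattern (here simply the smallest Hamming distance).
int mostSimilar(byte p) {
  int best = 0, bestDist = NUM_CHANNELS + 1;
  for (int i = 0; i < memoryCount; i++) {
    byte diff = situations[i] ^ p;
    int dist = 0;
    for (int b = 0; b < NUM_CHANNELS; b++) if (diff & (1 << b)) dist++;
    if (dist < bestDist) { bestDist = dist; best = i; }
  }
  return best;
}

// 1c) Learning: memorize a pattern as a suitable response to a situation.
void memorize(byte situation, byte response) {
  if (memoryCount < MEMORY_SIZE) {
    situations[memoryCount] = situation;
    responses[memoryCount]  = response;
    memoryCount++;
  }
}

void loop() {
  byte input = readPattern();           // 1a) perceive a pattern
  if (input == 0) return;               // 2a) nothing supplied yet - whatever arrives will be processed
  int analogy = mostSimilar(input);     // 1b) establish an analogy
  byte answer = responses[analogy];     // 1d) respond as learned for similar situations
  memorize(input, answer);              // 1c) remember this situation and its response
  showPattern(answer);                  // answer with an LED pattern
  delay(1000);                          // hold the answer, then clear the LEDs
  showPattern(0);
}

In a session, the user enters a button pattern, the system answers with the LED pattern it has learned (or inherited as an instinct) for the most similar known situation, and the new situation is memorized for future answers.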

All in all, it is shown that perceptions are placed into the system by sensors, that the system can react to perceptions with actors, and that the two can be connected by initial instincts; by reasoning, in whatever way it is undertaken, a general artificial intelligence should then be in a position to "figure out" how to act on input with suitable output.

I wish you a lot of fun - particularly with the Arduino blinkenlights machine intelligence!

- Nino

oninteraction20150511.pdf - the main document

blinkensource.cpp - my Arduino sketch source

Here are two videos explaining the concepts in a simplified way: