Robots and Cognitive Systems 

Recently I've been getting more interested and involved in robotics technologies. Partly this is to get a break from the semantic paradigm I've spent most of my professional career working within, and partly because I've become somewhat disillusioned with it: I don't think we're going to understand intelligence by building semantics in, but rather by watching semantics develop.

I've been researching various robotic architectures, though not a lot has changed since I helped develop a robotics exploration class at U Rochester in the early 90s. The traditional sense-plan-act loop beloved of AI in the 70s and 80s is pretty much dead now, replaced by (1) the Brooksian reactive/"behavioral" architecture, which is basically sense-act: no knowledge representation, and therefore no planning; and (2) the "three-layer" architecture, which augments a reactive layer (to allow real-time response) with a mission/planning layer on top and a binding layer in between that essentially enables and disables the reactive behaviors.

(3) On a more "cognitive" branch, there have been some attempts at using traditional production-system cognitive architectures such as Soar and ACT-R for robotics.
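The three-layer scheme in (2) above can be sketched in a few lines of Python. This is a minimal illustration, not any particular robot framework: all the names (behaviors, plan steps, the fixed-priority arbitration) are my own hypothetical choices. The reactive layer runs every tick; the binding layer enables or disables behaviors according to the current step of a top-level mission plan.

```python
class Behavior:
    """A reactive sense-act rule: maps the latest sensor reading to an action."""
    def __init__(self, name, act):
        self.name = name
        self.act = act          # sensors -> action string, or None to abstain
        self.enabled = False

def avoid(sensors):
    # Pure reaction to an obstacle; no planning involved.
    return "turn_away" if sensors.get("obstacle") else None

def seek_goal(sensors):
    return "drive_toward_goal" if not sensors.get("at_goal") else None

# Mission/planning layer: an ordered plan, each step naming the behaviors it needs.
PLAN = [
    {"step": "transit", "enable": {"avoid", "seek_goal"}},
    {"step": "dock",    "enable": {"avoid"}},
]

def bind(behaviors, plan_step):
    """Binding layer: enable exactly the behaviors the current plan step calls for."""
    for b in behaviors:
        b.enabled = b.name in plan_step["enable"]

def tick(behaviors, sensors):
    """Reactive layer: first enabled behavior that fires wins (fixed priority)."""
    for b in behaviors:
        if b.enabled:
            action = b.act(sensors)
            if action:
                return action
    return "idle"

behaviors = [Behavior("avoid", avoid), Behavior("seek_goal", seek_goal)]
bind(behaviors, PLAN[0])
print(tick(behaviors, {"obstacle": True}))   # avoidance preempts goal-seeking
print(tick(behaviors, {"obstacle": False}))
```

Note that a pure Brooksian system would keep only the bottom `tick` loop; the planning and binding layers are exactly what the three-layer architecture adds back on top.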
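The production-system style in (3) can be illustrated with a toy match-fire loop. To be clear, this is not Soar's or ACT-R's actual machinery, just the generic idea they share: rules match the contents of working memory and fire to add new facts, repeating until quiescence. The facts and rules here are hypothetical.

```python
def run(rules, working_memory):
    """Fire the first rule whose condition matches and whose additions are new;
    repeat until no rule fires (quiescence)."""
    fired = True
    while fired:
        fired = False
        for condition, additions in rules:
            if condition <= working_memory and not additions <= working_memory:
                working_memory |= additions
                fired = True
                break
    return working_memory

# Toy robot rules: each is (condition facts, facts to add), facts as strings.
RULES = [
    ({"battery_low"}, {"goal_recharge"}),
    ({"goal_recharge", "charger_visible"}, {"action_dock"}),
]

wm = run(RULES, {"battery_low", "charger_visible"})
print(wm)  # working memory now also contains goal_recharge and action_dock
```

The second rule only fires after the first has deposited its goal, which is the chaining that makes production systems attractive for deliberation; the robotics difficulty is coupling this loop to real-time sensing and action.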

Most intriguing to my mind is the work by Wei-Min Shen at USC ISI on robots that learn from their interaction with the environment, so that locomotion skills can be acquired on arbitrary hardware configurations. This gets close to the kind of self-development of semantics I'm interested in as well, and I hope they are successful in growing the applications of this technology.

To do some of this exploration on my own time (rather than burden my employer with the kind of training pretty much any robotics engineer should be graduating with), I'm in the process of putting together some basic robotics development tools at home (see the Home Robotics link for more).