Logging Data using Python

Oftentimes it is useful to output data from our haptics simulation, for example to analyze hand motions, button triggers, and other events, or to study object avoidance. Every shape has an "isTouched" field that can be routed into Python and thereby logged.
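For example, the field could be routed into a small PythonScript node and printed or written to a file whenever the shape is touched. The following is only a minimal sketch: the ROUTE in the comment and the DEF names MY_SHAPE and MY_SCRIPT are placeholders, and isTouched is treated as an MFBool (one entry per haptics device).

from H3DInterface import *

# Receives a shape's isTouched events, e.g. routed in the scene file with
#   <ROUTE fromNode="MY_SHAPE" fromField="isTouched" toNode="MY_SCRIPT" toField="touched"/>
class TouchLogger( AutoUpdate( MFBool ) ):
    def update( self, event ):
        touched = event.getValue()       # one bool per haptics device
        if True in touched:
            print "shape touched"        # or append a line to a log file here
        return touched

touched = TouchLogger()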

Alternatively, there is a DeviceLog node, which we will not discuss here; some information can be found on its H3D Doxygen page. It allows you to log information from the haptic device, but not other scene data such as "isTouched."

In our example we are going to compare the effect of constraining user motion while tracing a sphere. This builds on concepts discussed in Motion Constraints Using Python, and you should already be capable of developing haptically enabled primitive geometries.

The idea is that we will have someone attempt to trace around a sphere once with no haptic guidance and a second time with motion constraints that keep them on a vertical plane, preventing them from moving left and right. Our model is shown below. The black "paint" shows where the user has traced; it is created using the PaintableTexture node included and documented in the "Various Nodes" H3D package. For the sake of this tutorial we will not include the painting functionality.

The output of our project is a graph like the following, which compares the unconstrained and constrained tracings. The data will be processed using Matlab; alternatively, processing could be done in a variety of languages, including Python with the SciPy libraries for analysis and plotting (a sketch follows the figure below).

Figure: (Red) Constrained hand motion (Blue) Unconstrained hand motion
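
As mentioned above, the processing and plotting could also be done in Python instead of Matlab. The following is only a rough sketch using NumPy and matplotlib: it assumes each log file contains one "x y z" sample per line (the format produced later in this tutorial), uses placeholder file names, and skips the trimming and unit conversion done in the Matlab scripts.

import numpy as np
import matplotlib.pyplot as plt

def lateral_deviation(filename):
    data = np.loadtxt(filename)                 # N x 3 array of x, y, z samples
    y = data[:, 1] - data[:, 1].mean()          # lateral (y) deviation about the mean path
    angle = np.linspace(0, 2*np.pi, len(y))     # approximate rotation about the sphere
    return angle, y

angle_c, y_c = lateral_deviation('data_const.txt')   # constrained run (placeholder name)
angle_u, y_u = lateral_deviation('data_un.txt')      # unconstrained run (placeholder name)

plt.plot(angle_c, y_c, 'r', label='constrained')
plt.plot(angle_u, y_u, 'b', label='unconstrained')
plt.xlabel('rotation (rad)')
plt.ylabel('lateral deviation (device units)')
plt.legend()
plt.show()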


Our scene should contain general setup information plus three sections: geometry, force effects, and routing information.

We also introduce the Background node to change the background color. To increase contrast and improve aesthetics we will make the background white. "groundColor" and "skyColor" take RGB inputs, similar to how the appearance was set previously. Additional fields include ground angle, sky angle, and texture inputs for the top, bottom, left, and right.

For example, if we wanted a red ground and a green sky we would write:

<Background groundColor="1.0 0.0 0.0" skyColor="0.0 1.0 0.0"/>

To keep the end effector constrained along the central plane we will use the PositionFunctionEffect and follow the same principles as discussed in the motion constraints tutorial. A GeneralFunction node is kept for each axis so you can constrain or analyze motion about each axis independently.

This example uses a basic sphere for the user to trace around.

Routes should send the device position and button data to a Python script, and the Python script should route the resulting function string back to the force effect. The main button will be used to enable force feedback, and the secondary button will be used to stop Python from logging new data. Remember that to access the haptic device data we must IMPORT it from H3D_EXPORTS at the beginning of our scene file.

The following code should be easy to understand and implement if you have been following the set of tutorials.

Note: if you are copying and pasting the code, you must change the Python script location to reflect where the files are on your computer.

<!-- Import device data -->
<IMPORT inlineDEF="H3D_EXPORTS" exportedDEF="HDEV" AS="HDEV" />

<!-- Set background to white -->
<Background groundColor="1.0 1.0 1.0" skyColor = "1.0 1.0 1.0"/>

<!-- Force effects -->

<!-- Used to constrain user motion -->
    <PositionFunctionEffect>
        <GeneralFunction DEF="xFun" containerField="xFunction" function="0" params="x,y,z"/>
        <GeneralFunction DEF="yFun" containerField="yFunction" function="0" params="x,y,z"/>
        <GeneralFunction DEF="zFun" containerField="zFunction" function="0" params="x,y,z"/>
    </PositionFunctionEffect>


<!-- Object creation -->
    <Transform DEF="transform" rotation="1 .0 .0 .5">
        <Shape>
            <Appearance><Material diffuseColor="1 0 0"/></Appearance>
            <Sphere DEF="BOX" radius="0.1"/>
        </Shape>
    </Transform>

<!-- Restrict user motion and log data via Python -->   
<PythonScript DEF="Restrict" url="C:\H3D\CLEA\Constrain\line_restrict_log.py"/>

    <ROUTE fromNode= "HDEV" fromField="trackerPosition" toNode="Restrict" toField="position"/>

    <!-- If main button is pressed constrain the user's motion-->
    <ROUTE fromNode= "HDEV" fromField="mainButton" toNode="Restrict" toField="x_but"/>

    <!-- If alternate buttons are pressed we want to stop logging data -->
    <ROUTE fromNode= "HDEV" fromField="secondaryButton" toNode="Restrict" toField="close_logger"/>

    <!-- Send the function to the position function effect to constrain the user's motion -->
    <!-- Left and right motion is restricted -->
    <ROUTE fromNode="Restrict" fromField="x_but" toNode="xFun" toField="function"/>


Saving data with Python

In order to do analysis of our data we must log the position data within the Python script. To write to a file we first open a blank text file using the format

variable = open('filename', 'access type')

The access type can be read ('r'), write ('w'), append ('a'), or read and write ('r+'). The variable name will be used later to actually write data.
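
For example (the file name here is arbitrary):

# open (or create) a log file, write one "x y z" sample, then close it
f = open('data.txt', 'w')    # 'w' overwrites an existing file; use 'a' to append instead
f.write('0.0 0.0 0.0\n')
f.close()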

We will have two classes pertinent to the logger: "logger" and "closeLogger." This may not be the best programming practice, but it is easier given that each action has its own route coming in from H3D.

The logger class takes in the position information and formats it in a string using the format "x y z" for each time step. This data is appended to the previously opened data file using the f.write(x) command.

If the secondary button is pressed then the logger file is closed and a message is displayed in the dialog window.

from H3DInterface import *

#file to save data
f=open('C:/Users/colin/Desktop/PosterImages/constrain/data.txt', 'w')
print f

class X_fun( TypedField(SFString, SFBool) ):
    def update(self, event):
        routes_in = self.getRoutesIn()
        button = routes_in[0].getValue()
        if button:
            return "-300*x"   # spring-like force pulling the probe back to the x = 0 plane
        return "0"            # no constraint force

class logger(AutoUpdate(SFVec3f)):
    def update(self, event):
        routes_in = self.getRoutesIn()
        position = routes_in[0].getValue()
        x = str(position.x) + " " + str(position.y) + " " + str(position.z) + '\n'
        f.write(x)       # append this sample to the previously opened file
        return position

#closeLogger() used to stop logging data
class closeLogger(AutoUpdate(SFBool)):
    def update(self, event):
        routes_in = self.getRoutesIn()
        button = routes_in[0].getValue()
        if button:
            f.close()
            print "file closed"
        return button

x_but = X_fun() #constrains user motion
position = logger() #log position data
close_logger = closeLogger() #close logging file

Data Processing

Processing is done in Matlab to extract and analyze lateral deviations in the user's control. Information about reading text into Matlab can be found in the help files regarding fopen and fscanf.

Deviation is measured by finding the mean and standard deviation over all the data points. The standard deviation can be calculated using:

standard_dev = sqrt( (1/N) * SUM( (mean - val)^2 ) )

where N is the number of samples, mean is the average of all the data points, val is the value of an individual data point, and SUM adds the squared deviation of every data point. (The script below divides by N-1 instead of N, the usual sample estimate.)

For our purposes only the lateral (y-axis) deviation is analyzed. Data from both the constrained and unconstrained runs should be extracted using this Matlab file. The "save" command saves all of the declared variables to a .mat file used later for plotting.

% Process Data in Matlab.


close gcf
clear all

% Open the haptic data and store it all into a variable 'A'. Then close the
% file.
% data_un should be changed to data_const for constrained data.

file = fopen('/data_un.txt', 'r');
A = fscanf(file, '%g %g %g', [1 inf]);
fclose(file);

%find the length of our dataset
num_positions = length(A)/3;
%pos = zeros(num_positions, 4);
trig = 0;

%We found that the data we are interested in does not start until index 185
k = 185; %starting index
pos = zeros(1,4); %set position variable to the correct form, a [1x4] vector

% read in the position data
for i=k:num_positions-1
    j = i*3+1;
    pos(i-k+1,:)=[i-k, A(j), A(j+1), A(j+2)];

%     After the user completes the circle there will be an excessive amount
%     of repeating values showing the device not moving. Stop reading data
%     at this point.
%     To avoid stopping on an accidental repeat where the position stays
%     the same, we wait until there are 10 of the same positions in a row.
    if i > k && isequal(pos(i-k+1,2:4), pos(i-k,2:4))
        trig = trig+1;
    else
        trig = 0;
    end
    if trig > 10
        break
    end
end

%convert from 'units' to meters (4 inches is .1016 meters)
pos(:,2) = pos(:,2) * (.1016 / 0.3563);
pos(:,3) = pos(:,3) * (.1016 / 0.3563);
pos(:,4) = pos(:,4) * (.1016 / 0.3563);

%columns of pos are [index, x, y, z]
max_pos = length(pos)-9; %ignore the trailing repeated samples

axis equal
hold on;

%%%%%%%%%%%%%%%%%% statistics %%%%%%%%%%%%%%%%%%%
%standard dev = sqrt(1/N * SUM ( mean - val)^2 )
mid_y = 0;
y_var = 2; %column of pos holding the y data
for i = 1:max_pos
    mid_y = mid_y + pos(i,y_var);
end
mid_y = mid_y/max_pos;

dev = zeros(max_pos, 1);
sum_dev = 0;
for i = 1:max_pos
    dev(i) = pos(i,y_var) - mid_y;
    sum_dev = sum_dev + dev(i)^2;
end

standard_dev = (sum_dev/(max_pos-1))^.5; %in meters
sd_cm = standard_dev*100; %in cm

variance = sum_dev/(max_pos-1); % in m^2
var_cm = variance*1e4; % in cm^2

max_left_dev = (mid_y - min(pos(:,y_var)))*100;  % in cm
max_right_dev = (max(pos(:,y_var)) - mid_y)*100; % in cm

fprintf('Standard deviation (cm): %f \n', sd_cm);
fprintf('Variance (cm^2): %f \n', var_cm);
fprintf('Max Deviation, Left: %f Right: %f \n', max_left_dev, max_right_dev);
plot(pos(1:max_pos,1), pos(1:max_pos,y_var), '-r')

save %saves the data/workspace to a .mat file

Plotting Data

We make another Matlab file to plot our constrained and unconstrained data. The two trials may contain differing numbers of position samples, so it makes sense to compare the deviation as a function of rotation about the sphere. It is assumed that the user moves at approximately the same rate for the whole duration. This means we can normalize each data set by its number of data points and then multiply by 2*PI to approximate the amount of rotation.

%Compare the unconstrained and constrained datasets in Matlab


clear all
close gcf

% Load constrained data
%data previously stored in variables 'pos' and 'mid_y'
load constrained_madu.mat

%Put the constrained data into new variables
data_con = pos;
mid_con = mid_y;

% Load unconstrained data
load unconstrained_madu.mat

%Put the unconstrained data into new variables
data_uncon = pos;
mid_uncon = mid_y;

%find the length of the two datasets
len_con = length(data_con);
len_un = length(data_uncon);

% plot the two data sets; we want to transform the data from 0 to 2pi
% rather than 0 to time, so we transform the 'x' value in the plot fcn
plot(data_con(:,1)*2*pi/len_con, data_con(:,2)-mid_con, '-r');
hold on;
plot(data_uncon(:,1)*2*pi/len_un, data_uncon(:,2)-mid_uncon, '-b');
hold on;

axis([0 2*pi -.015 .015])

After playing around with the Matlab plot we get this graph: