Background
The motion simulation of the deployable structure can be performed with a target-shooting method, with accuracy up to 1e-12. This gives us the global coordinates of the deployable structure at any time. That is enough to simulate the positions of the structure's bodies, but unfortunately not to do any real rendering on top; after all, MATLAB is not built for that.
So here is the problem: how do we LINK MATLAB (where we do the computing) and Blender (where we do the rendering)?
Something about Blender
Blender is famous for being open-source and friendly to Python users. That is right: any action in Blender can be replicated with a Python script. The usual method is to consult the Blender documentation. One handy trick is to read the hints shown in the Info window of the Scripting tab, which echoes the Python equivalent of each action you perform in the UI. Keep that in mind.
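Since Blender actions map to Python calls, one common pattern is to have the computing side emit a script that Blender can then run. The sketch below is pure Python and runnable outside Blender: it turns rows of coordinates (such as those exported from MATLAB) into a script of `bpy` calls. The column layout and the object name `'Body'` are assumptions for illustration.

```python
# A pure-Python sketch of turning computed coordinates into a Blender script.
# The (frame, x, y, z) row format and the object name 'Body' are assumptions;
# the emitted bpy lines mirror what the Info window echoes when you move an
# object and insert a keyframe in the UI.

def generate_blender_script(rows):
    """rows: iterable of (frame, x, y, z) tuples computed elsewhere (e.g. MATLAB)."""
    lines = ["import bpy", "obj = bpy.data.objects['Body']"]
    for frame, x, y, z in rows:
        lines.append(f"obj.location = ({x}, {y}, {z})")
        lines.append(f"obj.keyframe_insert(data_path='location', frame={frame})")
    return "\n".join(lines)

script = generate_blender_script([(1, 0.0, 0.0, 0.0), (2, 1.5, 0.0, 0.0)])
print(script)
```

The generated text can be saved as a `.py` file and executed from Blender's Scripting tab.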
Blender has its own system for locating and rotating bodies, with intuitive parameters; the rotation center of a body is implicitly at its origin point. 'Location X, Y, Z' gives the coordinates of that origin point, and 'Rotation X, Y, Z' gives the rotation angles (in degrees, not radians, in the UI) around the X, Y, and Z axes respectively.
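One pitfall worth flagging: the UI panel displays rotation in degrees, but the Python API's `rotation_euler` property expects radians, so a script must convert. A minimal sketch of the conversion (the `bpy` usage in the comments and the object name `'Body'` are assumptions for illustration):

```python
import math

# The Blender UI shows 'Rotation X, Y, Z' in degrees, but the Python API's
# rotation_euler property takes radians, so scripts must convert explicitly.
def ui_rotation_to_api(deg_x, deg_y, deg_z):
    """Convert UI-style degree angles to the radian tuple bpy expects."""
    return tuple(math.radians(a) for a in (deg_x, deg_y, deg_z))

# Inside Blender one would then write something like
# (the object name 'Body' is an assumption):
#   import bpy
#   obj = bpy.data.objects['Body']
#   obj.location = (1.0, 2.0, 3.0)                 # 'Location X, Y, Z'
#   obj.rotation_euler = ui_rotation_to_api(90.0, 0.0, 45.0)

print(ui_rotation_to_api(90.0, 0.0, 45.0))
```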