Parallella

Above: Picture of my Parallella on top of a Corsair Blue LED fan. Taken on 7/10/2019

Above: the top command running on a Parallella with Ubuntu 20.04.

Updated (12/12/2023)


Links

https://markdewing.github.io/blog/posts/building-a-parallella-cluster/

https://sites.google.com/a/complexsys.info/scattershot/home/parallella-raspberry-pi-cluster-computing

http://parallellagram.org/

https://www.parallella.org/2017/04/26/openshmem-for-epiphany/

http://blog.codu.in/parallella/epiphany/bulk/cpp/2016/05/06/parallella_cpp.html

https://github.com/eliaskousk/parallella-riscv

Benchmark

Below: Linpack (linpackc) result on the Parallella's ARM cores. (Code from: https://github.com/2000nickels/linpackc )

Result: Rolled Double Precision 98801 Kflops; 1000 Reps
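
For context, linpackc derives its Kflops figure from the standard LINPACK operation-count estimate for factoring and solving an n x n system, divided by the measured time. A rough sketch of that calculation in C (the numbers and names below are illustrative placeholders, not values from the benchmark source):

/* Rough sketch of how a linpackc-style Kflops figure is computed.
   The problem size and timing below are illustrative, not measured data. */
#include <stdio.h>

int main(void)
{
    int n = 1000;              /* illustrative problem size */
    double seconds = 6.75;     /* illustrative measured solve time */
    /* Standard LINPACK flop estimate: (2/3)*n^3 + 2*n^2 */
    double ops = (2.0 * n * n * n) / 3.0 + 2.0 * n * n;
    double kflops = ops / (1000.0 * seconds);
    printf("%.0f Kflops\n", kflops);
    return 0;
}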


Hello World on Parallella using gcc: (added on 2/15/2022)

Above: Classic C Hello World from: https://developer.arm.com/documentation/dui1093/a/Getting-Started/Compiling-a-Hello-World-example

You can compile it with gcc helloworld.c

Then type ./a.out in the shell to run the compiled program. It should print "Hello World" in your shell.
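
For reference, the program itself is the classic few-line C example (reproduced here from memory; the ARM page linked above has essentially the same code):

#include <stdio.h>

int main(void)
{
    /* print the greeting and exit */
    printf("Hello World\n");
    return 0;
}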

Above: Geometric Decomposition on Epiphany.

Code from: https://github.com/mesham/epython/blob/master/docs/tutorial3.md
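
The tutorial linked above is written in ePython; as a rough illustration of the same idea in plain C (my own sketch, not the tutorial's code), a 1-D geometric decomposition gives each core a contiguous block of the domain, with the last core picking up any remainder:

/* Sketch of a 1-D geometric (block) decomposition across the 16 Epiphany cores.
   The core count and domain size here are illustrative. */
#include <stdio.h>

#define NCORES  16      /* number of Epiphany cores */
#define NPOINTS 1024    /* total number of domain points */

int main(void)
{
    int chunk = NPOINTS / NCORES;
    for (int core = 0; core < NCORES; core++) {
        int start = core * chunk;
        int end = (core == NCORES - 1) ? NPOINTS : start + chunk;
        printf("core %2d handles points [%d, %d)\n", core, start, end);
    }
    return 0;
}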

2/23/2020

Today I built Parallella Lisp (plisp). It compiled fine on my Ubuntu 16.04 Parallella box.

Some test output: 

> (  reverse (  quote (  1  2  3  4  5  6  7  ) ) )

(  7  6  5  4  3  2  1  )

> (  testfun  101  )

 202 

> (  any  numberp (  quote (  a  b  ) ) )

 nil 

processor id:    15

memory:   11700480

node size:   16

nnodes:   10967

nodemem:   175472

nnames:   1563

namemem:   62520

nstrings:   0

stringmem:   0

setflag message:  Exited normally!


Updated: 5/11/2019

I got Ubuntu 16.04 running on my Parallella by first upgrading to Ubuntu 15.10 and then to 16.04. It took several hours in total, and I had to make sure my ssh connection didn't die. I tried the eMesh Epiphany sample, and it compiled fine.

Some Parallella code I wrote or modified.

Simple Code in Epiphany Basic (ebasic)

/*
Simple Printing and Core Identity and Loop Sample
By Tsubasa Kato 2015
*/
a=0
for i=1 to 10
  a = a + 2
  print a+" "+coreid
next

ePython script (Written on 2016-10-22)

import parallel

a=0
b=0

# Exchange values with the neighbouring cores: each core sends its own id
# and receives the id of the core below/above it (where such a core exists).
if coreid() > 0:
        a=sendrecv(coreid(), coreid()-1)

if coreid() < numcores()-1:
        b=sendrecv(coreid(), coreid()+1)

print "Values are "+a+" and "+b

# Combine the square roots of the two received ids.
c = sqrt(a)
d = sqrt(b)
e = c + d
print(e)

7/22/2018:

Maintaining the Parallella for use in 2018: vivid -> xenial. Working OK for now. Fingers crossed.

Parallella ray tracing demo I ran on the HDMI version.

Photo on left: two Parallella Micro Servers in a black box.

9/6/2018:

I've been experimenting with various ways to cool the Parallella since the summer was very hot this year. Below is a photo of one of the versions (the Parallella in a blue case).

4/30/2018:

I am planning to get the Parallellas up and running again in my lab. I want to be able to use them as a low-power but many-core experiment environment.

1/18/2018:

The other day I powered up my Parallella and connected to it from my desktop computer. I didn't do much, just configured it with a static IP address. I need to give the other Parallella a static IP address as well.

12/27/2017:

Searched for possible big data applications of the Parallella and found one site (see the CODUIN link above).

4/26/2017: Since I've been busy, I haven't been playing with the Parallella for a while, but I intend to, to make sure my skills don't get rusty. 

4/8/2017: I measured the temperature using ztemp.sh, and the temperatures of both Parallella Micro Servers have been 43-44.5 degrees Celsius recently. This is probably because it's getting warmer here in Japan. I am thinking of purchasing a USB-powered fan. 

3/26/2017: I enclosed the other Parallella Micro Server in an aluminum case. It took a while to fit it in, but the temperature is stable at around 41.5-44.0 degrees Celsius. 

3/21/2017: I enclosed one Parallella Micro Server inside an aluminum case available from Amazon.com. I am going to enclose the other one soon.

Parallella with a fan. The fan is an old Intel fan connected to USB power via an adapter cable. It cools quite well. 

Parallella epython Mandelbrot example running (video)

Below: me testing a script (emesh_bandwidth_all2one). 

The all-to-one on-chip communication bandwidth came out to 3889.00 MB/s.