Parallella

Above: Picture of my Parallella on top of a Corsair Blue LED fan. Taken on 7/10/2019


Updated (7/10/2019)

Links

  • Building a Parallella Cluster by Mark Dewing

https://markdewing.github.io/blog/posts/building-a-parallella-cluster/

  • Parallella/Raspberry Pi Cluster Computing - scattershot

https://sites.google.com/a/complexsys.info/scattershot/home/parallella-raspberry-pi-cluster-computing

  • Parallellagram

http://parallellagram.org/

  • OpenSHMEM for Epiphany

https://www.parallella.org/2017/04/26/openshmem-for-epiphany/

  • Using C++ to program the Parallella - CODUIN

http://blog.codu.in/parallella/epiphany/bulk/cpp/2016/05/06/parallella_cpp.html

  • RISC-V port to Parallella Board

https://github.com/eliaskousk/parallella-riscv

Benchmark

Below: Linpack (linpackc) result on the ARM cores of the Parallella. (Code from: https://github.com/2000nickels/linpackc )

Result: Rolled Double Precision, 98801 Kflops, 1000 reps
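
For context, the "rolled" figure refers to the plain, non-unrolled daxpy loop that linpackc spends most of its time in. A minimal sketch of that kernel (simplified from the classic LINPACK C port; names are illustrative):

/* Rolled daxpy: dy = dy + da * dx, one multiply-add per element,
   with no manual loop unrolling. */
static void daxpy_rolled(int n, double da, const double *dx, double *dy)
{
    if (n <= 0 || da == 0.0)
        return;
    for (int i = 0; i < n; i++)
        dy[i] += da * dx[i];
}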


Updated: 5/11/2019

I got Ubuntu 16.04 running on my Parallella by first upgrading to Ubuntu 15.10 and then to 16.04. It took several hours in total, while making sure my ssh connection didn't die. I tried the eMesh Epiphany sample, and it compiled fine.
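
For reference, here is a minimal sketch of how a host program typically drives an Epiphany kernel through the SDK's e-hal library; the device-side binary name e_task.elf is just a placeholder, not the eMesh sample itself.

#include <stdio.h>
#include <e-hal.h>

int main(void)
{
    e_platform_t platform;
    e_epiphany_t dev;

    e_init(NULL);                  /* read the default platform description */
    e_reset_system();              /* reset the Epiphany chip */
    e_get_platform_info(&platform);

    /* Open a workgroup spanning the whole chip (4x4 on a Parallella-16). */
    e_open(&dev, 0, 0, platform.rows, platform.cols);

    /* Load the device ELF onto every core and start it immediately. */
    e_load_group("e_task.elf", &dev, 0, 0, platform.rows, platform.cols, E_TRUE);

    /* ... wait for results, e.g. poll core-local memory with e_read() ... */

    e_close(&dev);
    e_finalize();
    return 0;
}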

Some Parallella code I wrote or modified.

Simple Code in Epiphany Basic (ebasic)

/*
Simple Printing and Core Identity and Loop Sample
By Tsubasa Kato 2015
*/
a=0
for i=1 to 10
a = a + 2
print a+" "+coreid
next

ePython script (Written on 2016-10-22)

import parallel   # ePython's parallel module: coreid(), numcores(), sendrecv()
a=0
b=0
# Exchange core ids with the left neighbour, if there is one.
if coreid() > 0:
        a=sendrecv(coreid(), coreid()-1)

# Exchange core ids with the right neighbour, if there is one.
if coreid() < numcores()-1:
        b=sendrecv(coreid(), coreid()+1)
print "Values are "+a+" and "+b
# Sum of the square roots of the two values received from the neighbours.
c = sqrt(a)
d = sqrt(b)
e = c + d
print(e)

7/22/2018:

Maintaining the Parallella for use in 2018: upgraded vivid -> xenial. Working OK for now. Fingers crossed.

Parallella Ray Tracing Demo I ran on the HDMI version.

Photo on left: Parallella Micro Server x 2 in a black box.

9/6/2018:

I've been experimenting with various ways to cool the Parallella, since the summer was very hot this year. Below is a photo of one of the versions (the Parallella in a blue case).

4/30/2018:

I am planning to get the Parallellas up and running again in my lab. I want to use them as a low-power, many-core experimentation environment.

1/18/2018:

The other day I powered up my Parallella and connected to it from my desktop computer. I didn't do much, just configured it with a static IP address. I need to give the other Parallella a static IP address as well.

12/27/2017:

Searched for possible big-data applications of the Parallella and found one site (see the CODUIN link above).

4/26/2017: Since I've been busy, I haven't been playing with the Parallella for a while, but I intend to, to make sure my skills don't get rusty.

4/8/2017: I measured the temperature using ztemp.sh, and recently the temperature of both Parallella Micro Servers has been 43-44.5 degrees Celsius. This is probably because it's getting warmer here in Japan. I am thinking of purchasing a USB-powered fan.
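
ztemp.sh reads the Zynq's on-chip temperature sensor. Below is a rough sketch of a similar readout in C, assuming the standard IIO sysfs paths for the Zynq XADC driver; the exact device node is an assumption and may differ between images.

#include <stdio.h>

/* Read a single numeric value from a sysfs file (returns 0.0 on failure). */
static double read_sysfs(const char *path)
{
    double v = 0.0;
    FILE *f = fopen(path, "r");
    if (f) {
        if (fscanf(f, "%lf", &v) != 1)
            v = 0.0;
        fclose(f);
    }
    return v;
}

int main(void)
{
    /* Assumed xadc IIO node; check /sys/bus/iio/devices/ on your board. */
    const char *base = "/sys/bus/iio/devices/iio:device0";
    char path[128];
    double raw, offset, scale;

    snprintf(path, sizeof path, "%s/in_temp0_raw", base);    raw    = read_sysfs(path);
    snprintf(path, sizeof path, "%s/in_temp0_offset", base); offset = read_sysfs(path);
    snprintf(path, sizeof path, "%s/in_temp0_scale", base);  scale  = read_sysfs(path);

    /* IIO convention: temperature in millidegrees C = (raw + offset) * scale. */
    printf("Zynq temperature: %.1f C\n", (raw + offset) * scale / 1000.0);
    return 0;
}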

3/26/2017: I enclosed the other Parallella Micro Server in an aluminum case. It took a while to fit it in, but the temperature is stable at around 41.5-44.0 degrees Celsius.

3/21/2017: I enclosed one Parallella Micro Server in an aluminum case available from Amazon.com. I am going to enclose the other one soon.

Parallella with a fan. The fan is an old Intel fan connected to a USB port via an adapter cable. It cools quite well.

Parallella epython Mandelbrot example running (video)