With the “Gerlach card” up and running, I was looking for a way to build something Transputers were made for: a farm, grid, network, cluster – call it what you like.
By a lucky coincidence I got in contact with some people at DESY, one of the world’s leading accelerator centres. DESY develops, builds and operates large accelerator facilities which are used to investigate the structure of matter – comparable to CERN in Switzerland.
I learned that they were scheduled to switch off a part of their facility, namely ZEUS, as all the planned research had been completed… and that they use(d) several Transputers for data acquisition and real-time analysis!
So after two years of schmoozing and sending ASCII-art flowers by e-mail, I was allowed to give their Transputers a new home (otherwise they would have been destroyed – oh my!).
I was surprised to see that they hadn’t used nearly as many Transputers as I had expected (hundreds?). No wonder, then, that they were able to replace them with a single Linux box for the last year of the project, keeping the Transputers as a hot-standby backup.
But I also got all their spare parts and everything else… a good start.
The original system consisted of 12 Transputers, each on a TRAM, four to a custom-made TRAM carrier board called “TRAMWAY”, which looks like this:
It’s not really worth calling it a board – it’s mainly a TRAM carrier with RS422 drivers for each link. The links between the TRAMs are hardwired: TRAM1 has one “down link” (i.e. from the host or another card) connected to its link0, and its links 1, 2 & 3 are connected to TRAM2, 3 & 4 respectively.
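To make the hard-wiring easier to picture, here is a small sketch of one fully populated TRAMWAY in Python (my own slot numbering and labels, nothing official) – the same fan-out pattern shows up in the ispy listing further down:

# Hardwired link map of one fully populated TRAMWAY.
# Slots 1-4 and the "DOWN" label are my own convention.
TRAMWAY_LINKS = {
    (1, 0): "DOWN",   # TRAM1 link0 <- host or previous card (via RS422)
    (1, 1): (2, 0),   # TRAM1 link1 -> TRAM2 link0
    (1, 2): (3, 0),   # TRAM1 link2 -> TRAM3 link0
    (1, 3): (4, 0),   # TRAM1 link3 -> TRAM4 link0
}

def free_links(slot):
    """Links of a slot not used by the on-board wiring, i.e. available
    (through the RS422 drivers) to feed further boards downstream."""
    used = {link for (s, link) in TRAMWAY_LINKS if s == slot}
    used |= {t[1] for t in TRAMWAY_LINKS.values()
             if isinstance(t, tuple) and t[0] == slot}
    return [l for l in range(4) if l not in used]

for slot in range(1, 5):
    print(f"TRAM{slot}: free links {free_links(slot)}")
# TRAM1 has none left; TRAM2-4 offer links 1-3 each - which is exactly
# where the tree keeps growing in the ispy listing below.

So each board consumes one down link and can feed up to nine further boards – chain a few of them and you get a nicely branching tree.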
Six of those TRAMWAYs currently share a case, making it The Tower Of Power:
As you can see, I stacked some TRAMs (a size-1 TRAM on a size-2 TRAM) to squeeze in the maximum number of TRAMs… power to the tower, man!
Here’s another picture showing the ToP with its host, an Intel LP486:
This is what the full power looks like in ‘ispy’:
Using 150 ispy 3.23 | mtest 3.22
# Part rate Link# [ Link0 Link1 Link2 Link3 ] RAM
0 T425b-20 239k 0 [ HOST … … 1:0 ] 136K.
1 T805b-25 1.5M 0 [ 0:3 2:0 3:0 4:0 ] 4100K;
2 T800d-20 1.8M 0 [ 1:1 5:0 6:0 … ] 1028K;
3 T800d-20 1.8M 0 [ 1:2 7:0 8:0 … ] 1028K;
4 T800d-20 1.8M 0 [ 1:3 9:0 … … ] 1028K;
5 T800d-20 1.6M 0 [ 2:1 10:0 11:0 12:0 ] 1028K;
6 T800d-20 1.6M 0 [ 2:2 13:0 14:0 15:0 ] 1028K;
7 T800d-20 1.6M 0 [ 3:1 16:0 17:0 18:0 ] 1028K;
8 T425b-20 1.8M 0 [ 3:2 19:0 20:0 21:0 ] 132K;
9 T800d-20 1.6M 0 [ 4:1 22:0 23:0 24:0 ] 1028K;
10 T805d-20 1.8M 0 [ 5:1 … … … ] 1028K;
11 T800c-17 1.8M 0 [ 5:2 … … … ] 2052K;
12 T800c-20 1.8M 0 [ 5:3 … … … ] 1028K;
13 T800d-20 1.8M 0 [ 6:1 … … … ] 1028K;
14 T800d-20 1.8M 0 [ 6:2 … … … ] 132K;
15 T800d-20 1.6M 0 [ 6:3 … … … ] 132K;
16 T800d-20 1.8M 0 [ 7:1 … … … ] 1028K;
17 T800c-17 1.7M 0 [ 7:2 … … … ] 2052K;
18 T800c-20 1.7M 0 [ 7:3 … … … ] 1028K;
19 T425b-20 1.8M 0 [ 8:1 … … … ] 132K;
20 T425a-20 1.8M 0 [ 8:2 … … … ] 132K;
21 T425b-20 1.8M 0 [ 8:3 … … … ] 132K;
22 T805d-20 1.8M 0 [ 9:1 … … … ] 1028K;
23 T800d-25 1.8M 0 [ 9:2 … … … ] 2052K;
24 T800d-20 1.8M 0 [ 9:3 … … … ] 1028K;
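If you want to tally such a listing – CPUs per part and total RAM – a quick-and-dirty Python one-off like this does the trick (my own throwaway sketch, not part of the INMOS toolset; it assumes the output above is saved verbatim to a file):

# Tally an ispy listing: how many of which part, and how much RAM in total.
# Assumes the listing was saved verbatim as 'ispy.txt' (name is arbitrary).
import re
from collections import Counter

parts = Counter()
total_ram_kb = 0

with open("ispy.txt") as f:
    for line in f:
        # Node lines look like: "1 T805b-25 1.5M 0 [ 0:3 2:0 3:0 4:0 ] 4100K;"
        m = re.match(r"\s*\d+\s+(T\S+)\s+.*\]\s+(\d+)K", line)
        if m:
            parts[m.group(1)] += 1
            total_ram_kb += int(m.group(2))

for part, count in parts.most_common():
    print(f"{count:2d} x {part}")
print(f"Total RAM: {total_ram_kb} KB (~{total_ram_kb / 1024:.1f} MB)")

Nothing fancy – the regex just keys on the part name and the trailing RAM figure, so the varying link columns in between don’t matter.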