Latest revision as of 21:23, 1 January 2010

Deploying Files to PlanetLab Nodes

To deploy your application or files to PlanetLab nodes, you need the parallel secure copy command (pscp). For Mac users, use the command below:

pscp -h nodes.txt -l slice_name -t 180 -p 200 yourapp.tar /home/slice_name/yourapp.tar

Replace slice_name and yourapp.tar with the appropriate values.

If you are a Linux user, the Ubuntu package to install is the pssh package. It contains several commands (more information on these commands and their usage can be found in the pssh HOWTO at http://www.theether.org/pssh/docs/0.2.3/pssh-HOWTO.html):

  • Parallel ssh (parallel-ssh, upstream calls it pssh), executes commands on multiple hosts in parallel
  • Parallel scp (parallel-scp, upstream calls it pscp), copies files to multiple remote hosts in parallel
  • Parallel rsync (parallel-rsync, upstream calls it prsync), efficiently copies files to multiple hosts in parallel
  • Parallel nuke (parallel-nuke, upstream calls it pnuke), kills processes on multiple remote hosts in parallel
  • Parallel slurp (parallel-slurp, upstream calls it pslurp), copies files from multiple remote hosts to a central host in parallel
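Because Ubuntu installs the commands under the parallel-* names while upstream documentation uses the short names, scripts written against one naming scheme can fail on the other. A minimal sketch (assuming a POSIX shell) that picks whichever name is present:

```shell
# Pick whichever name of the parallel-ssh command is installed.
# On Ubuntu/Debian the package provides "parallel-ssh"; upstream
# installs the same tool as "pssh". Falls back to "pssh" if neither
# is on the PATH (install the pssh package first in that case).
if command -v parallel-ssh >/dev/null 2>&1; then
  PSSH=parallel-ssh
elif command -v pssh >/dev/null 2>&1; then
  PSSH=pssh
else
  PSSH=pssh
fi
echo "using $PSSH"
```

The same pattern applies to pscp/parallel-scp and the other commands in the list above.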


The -t switch sets the per-node timeout in seconds. For example, -t 180 gives the pscp command a timeout of 3 minutes per node. A "Timeout" status is reported for any node that fails to respond within that period.


The -p switch sets the maximum number of parallel threads for the pscp process. A value of 200 means the process will attempt to copy the file to 200 nodes concurrently. Do not abuse this setting: spawning too many threads will exhaust the memory on your local machine.
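The -h switch, used in every command on this page, expects a plain text file listing one node hostname per line. A minimal sketch of creating one (the hostnames below are placeholders; substitute the nodes assigned to your slice):

```shell
# Create nodes.txt with one PlanetLab hostname per line.
# These hostnames are placeholders, not real PlanetLab nodes.
cat > nodes.txt <<'EOF'
planetlab1.example.edu
planetlab2.example.edu
planetlab3.example.edu
EOF
```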


Executing Remote Commands

You can use pssh to issue a command to all nodes in your slice. For example, the command below untars the yourapp.tar uploaded in the example above.

pssh -h nodes.txt -l sfu_games -t 180 -p 200 -o /tmp "tar xvf yourapp.tar"
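The -o switch tells pssh to save each node's standard output to a file named after that host in the given directory, which is how you confirm the command actually ran everywhere. A sketch of reviewing those files afterwards (the per-host files here are simulated with placeholder hostnames and a /tmp/out directory of my choosing; in a real run pssh writes them itself):

```shell
# Simulate the per-host files a run with "-o /tmp/out" would leave
# behind (in a real run, pssh creates these), then aggregate them.
mkdir -p /tmp/out
echo "yourapp/" > /tmp/out/planetlab1.example.edu
echo "yourapp/" > /tmp/out/planetlab2.example.edu
cat /tmp/out/*   # review the tar output from every node in one place
```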


Installing Java

Below is a series of commands used to deploy Java on PlanetLab.

The content of install_jvm.sh

#!/bin/sh
# Pipe "q" (to quit the license pager) and "yes" (to accept the license)
# into the installer; grep keeps only the extraction progress lines.
echo -e "q\nyes\n" | sudo sh jre-linux.bin | grep "\.\.\."

Commands

Download the Linux binary for the Java Runtime Environment

pssh -h nodes.txt -l sfu_games -p 200 -t 600 -o /tmp/test1 "wget -nc http://nsl.cs.sfu.ca/tmp/jre-linux.bin"

Some nodes do not have wget; for those, use the direct copy method instead (note the longer timeout and lower parallelism, since the whole binary is transferred from your machine):

pscp -h nodes.txt -l sfu_games -t 1800 -p 10 jre-linux.bin /home/sfu_games/jre-linux.bin

Copy the installation shell script to all the nodes in the slice

pscp -h nodes.txt -l sfu_games -t 180 -p 200 install_jvm.sh /home/sfu_games/install_jvm.sh

Make the shell script executable on all the nodes

pssh -h nodes.txt -l sfu_games -p 200 -o /tmp "chmod 755 /home/sfu_games/install_jvm.sh"

Execute the shell script on all the nodes of the slice

pssh -h nodes.txt -l sfu_games -p 200 -t 300 -o /tmp "./install_jvm.sh"