For development, job submission uses a single account on the compute server and relies on the following directory structure:
For each user, the SciPortal requires information about the user's authorization and authentication on the compute servers. For example, consider the current entries for user jkl:
origin.osc.edu   192.148.248.24   pse   /home/pse/jkl
media1.osc.edu   192.148.249.51   pse   /home/pse/jkl
In addition, information about the applications is obtained from the WebFlow Abstract Application Descriptor (AAD).
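As an illustration only, a per-user entry for a compute server could be represented and parsed as in the following Java sketch; the class and field names are assumptions and not part of SciPortal or WebFlow.

    // Hypothetical sketch: one entry of the per-user compute-server table,
    // parsed from a whitespace-separated line such as
    // "origin.osc.edu 192.148.248.24 pse /home/pse/jkl".
    public class ComputeServerEntry {
        public final String host;      // e.g. origin.osc.edu
        public final String address;   // e.g. 192.148.248.24
        public final String account;   // shared account, e.g. pse
        public final String homeDir;   // user's directory, e.g. /home/pse/jkl

        public ComputeServerEntry(String host, String address,
                                  String account, String homeDir) {
            this.host = host;
            this.address = address;
            this.account = account;
            this.homeDir = homeDir;
        }

        // Parse a single line of the user's entry file.
        public static ComputeServerEntry parse(String line) {
            String[] f = line.trim().split("\\s+");
            return new ComputeServerEntry(f[0], f[1], f[2], f[3]);
        }

        public static void main(String[] args) {
            ComputeServerEntry e =
                ComputeServerEntry.parse("origin.osc.edu 192.148.248.24 pse /home/pse/jkl");
            System.out.println(e.host + " -> " + e.homeDir);
        }
    }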
To execute an application, the information above is used to convert a script template into an actual job submission script that can run the job.
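A minimal sketch of this template-to-script conversion is given below; the placeholder names (${HOST}, ${ACCOUNT}, ${HOME_DIR}, ...) and the class itself are assumptions used only for illustration.

    // Hypothetical sketch: fill a job submission script template with the
    // per-user values taken from the entries above and from the AAD.
    import java.util.HashMap;
    import java.util.Map;

    public class ScriptTemplate {
        // Replace every ${NAME} placeholder in the template with its value.
        public static String instantiate(String template, Map<String, String> values) {
            String script = template;
            for (Map.Entry<String, String> e : values.entrySet()) {
                script = script.replace("${" + e.getKey() + "}", e.getValue());
            }
            return script;
        }

        public static void main(String[] args) {
            Map<String, String> values = new HashMap<String, String>();
            values.put("HOST", "origin.osc.edu");    // from the user's entry
            values.put("ACCOUNT", "pse");            // shared account
            values.put("HOME_DIR", "/home/pse/jkl"); // user's directory
            values.put("USER", "jkl");
            String template = "# job script for ${USER} on ${HOST}\n"
                            + "cd ${HOME_DIR}\n"
                            + "./run_application\n";
            System.out.println(instantiate(template, values));
        }
    }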
The status of a job will be monitored not by polling but by the job signaling its status to the server. There is a servlet on the server, operating on an unprotected port, which listens for event notifications from the jobs running on the backend machine. The format of these notifications will be an HTTP GET request of the form:
A small utility called the notifier is installed on the backend machine (the compute server) and sends the notification request to the listener. This notifier can be as simple as wget:
or it can be a Perl script which takes the URL from the command line and makes a request to the Web server for the document (obviously, the document being requested is irrelevant in this case, and the servlet can simply return an OK response to make sure the HTTP protocol is followed).
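The exact notifier implementation is left open above (wget or a Perl script); the following is a minimal Java sketch of such a utility, assuming the listener URL, including the user/job/event parameters, is passed on the command line as in the Perl variant.

    // Hypothetical sketch of the notifier: take a URL from the command line,
    // issue an HTTP GET to the listener, and ignore the returned document.
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class Notifier {
        public static void main(String[] args) throws Exception {
            URL url = new URL(args[0]);   // the listener URL with user/job/event parameters
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");
            // The body of the response is irrelevant; only the request itself matters.
            InputStream in = conn.getInputStream();
            while (in.read() != -1) { /* drain and discard */ }
            in.close();
            conn.disconnect();
        }
    }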
At this time all communication is unencrypted. The HTTP GET request example is
If the information in the request is instead encrypted, the listener servlet will need to decrypt it before processing it. Note that the backend does not perform any encryption itself and therefore does not need any encryption software or keys. Moreover, to restrict potential attackers to users of the backend machines, the port on which the listener listens can easily be configured to reject all requests coming from hosts other than the backend machines.
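A minimal sketch of such a listener servlet is given below; the servlet name, the status-file location, and the list of allowed backend addresses are assumptions, while the user, job, and event parameters correspond to the notifier calls shown further down. Encryption is omitted.

    // Hypothetical sketch of the listener servlet: accept event notifications,
    // reject requests that do not come from a known backend host, and append
    // the reported event to the user's status file.
    import java.io.FileWriter;
    import java.io.IOException;
    import java.util.Arrays;
    import java.util.List;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class JobEventListener extends HttpServlet {
        // Assumption: only these backend machines may report events
        // (the addresses are taken from the example entries above).
        private static final List<String> ALLOWED_HOSTS =
            Arrays.asList("192.148.248.24", "192.148.249.51");

        // Assumption: stands in for the elided prefix of the
        // .../Descriptors/User/<user>/<projectId>/status path.
        private static final String STATUS_BASE = "/path/to/portal";

        public void doGet(HttpServletRequest req, HttpServletResponse res)
                throws IOException {
            if (!ALLOWED_HOSTS.contains(req.getRemoteAddr())) {
                res.sendError(HttpServletResponse.SC_FORBIDDEN);
                return;
            }
            String user  = req.getParameter("user");   // e.g. jkl
            String job   = req.getParameter("job");    // e.g. myjob123
            String event = req.getParameter("event");  // e.g. started, finished

            // Append the event to the status file; the job parameter is used
            // in place of the project id here for simplicity (an assumption).
            FileWriter status = new FileWriter(
                STATUS_BASE + "/Descriptors/User/" + user + "/" + job + "/status", true);
            status.write(event + " " + new java.util.Date() + "\n");
            status.close();

            // The returned document is irrelevant; just complete the HTTP exchange.
            res.setContentType("text/plain");
            res.getWriter().println("OK");
        }
    }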
When the project is started by the user and a job submission is being prepared, the job submission script template is updated so that the arguments of the notifier calls carry the correct information. Consider a simple case:
/usr/local/bin/notifier?user=jkl&job=myjob123&event=started
..... the actual work is being done here
/usr/local/bin/notifier?user=jkl&job=myjob123&event=finished
Just before the job is submitted to the backend, the jobSubmission manager writes to the file /..../Descriptors/User/jkl/projectId/status a line:
followed by the output of the ssh command, which lists the job ID assigned by NQS on the backend together with any error/status messages. When NQS executes the job, the notifier sends the event "started" to the listener on the server. When the listener receives the notification, it writes the status and other information to the file, and the applet which monitors the status of the job can also receive this information.
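As a sketch of the submission step (the command line, script location, and status-file path are assumptions), the jobSubmission manager might run ssh, capture what NQS prints, and append it to the status file as follows:

    // Hypothetical sketch: submit the generated script over ssh and append
    // whatever NQS prints (the job id and any error/status messages) to the
    // per-project status file.
    import java.io.BufferedReader;
    import java.io.FileWriter;
    import java.io.InputStreamReader;

    public class SubmitJob {
        public static void main(String[] args) throws Exception {
            // Assumptions: qsub is the NQS submission command on the backend,
            // and the script has already been copied to the user's directory.
            ProcessBuilder pb = new ProcessBuilder(
                "ssh", "pse@origin.osc.edu", "qsub", "/home/pse/jkl/myjob123.sh");
            pb.redirectErrorStream(true);   // capture error messages with the normal output
            Process p = pb.start();

            BufferedReader out = new BufferedReader(new InputStreamReader(p.getInputStream()));
            // Assumption: the elided prefix of the status-file path is omitted here.
            FileWriter status = new FileWriter("Descriptors/User/jkl/projectId/status", true);
            String line;
            while ((line = out.readLine()) != null) {
                status.write(line + "\n");   // job id and status/error messages from NQS
            }
            status.close();
            out.close();
            p.waitFor();
        }
    }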
When the job ends, WebFlow on the server copies files from the backend to
and appends to the status file.
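A minimal sketch of this final step is given below, with scp standing in for whatever transfer mechanism WebFlow actually uses and with assumed directory names:

    // Hypothetical sketch: after the "finished" event, copy result files from
    // the backend to the server and record the completion in the status file.
    import java.io.FileWriter;

    public class CollectResults {
        public static void main(String[] args) throws Exception {
            // Assumptions: scp is available, and the local destination directory
            // stands in for the elided target path on the server.
            ProcessBuilder pb = new ProcessBuilder(
                "scp", "-r", "pse@origin.osc.edu:/home/pse/jkl/myjob123",
                "/path/to/portal/results/jkl/myjob123");
            pb.inheritIO();           // show scp's own messages
            pb.start().waitFor();

            // Assumption: the elided prefix of the status-file path is omitted here.
            FileWriter status = new FileWriter("Descriptors/User/jkl/projectId/status", true);
            status.write("results copied " + new java.util.Date() + "\n");
            status.close();
        }
    }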