Source Code Cross Referenced for Condor.java in » Workflow-Engines » pegasus-2.1.0 » org.griphyn.cPlanner.transfer.sls

001:        /**
002:         * This file or a portion of this file is licensed under the terms of
003:         * the Globus Toolkit Public License, found in file GTPL, or at
004:         * http://www.globus.org/toolkit/download/license.html. This notice must
005:         * appear in redistributions of this file, with or without modification.
006:         *
007:         * Redistributions of this Software, with or without modification, must
008:         * reproduce the GTPL in: (1) the Software, or (2) the Documentation or
009:         * some other similar material which is provided with the Software (if
010:         * any).
011:         *
012:         * Copyright 1999-2004 University of Chicago and The University of
013:         * Southern California. All rights reserved.
014:         */package org.griphyn.cPlanner.transfer.sls;
015:
016:        import org.griphyn.cPlanner.classes.PegasusBag;
017:        import org.griphyn.cPlanner.classes.FileTransfer;
018:        import org.griphyn.cPlanner.classes.SubInfo;
019:        import org.griphyn.cPlanner.classes.PegasusFile;
020:        import org.griphyn.cPlanner.classes.PlannerOptions;
021:
022:        import org.griphyn.cPlanner.common.PegasusProperties;
023:        import org.griphyn.cPlanner.common.LogManager;
024:
025:        import org.griphyn.cPlanner.transfer.SLS;
026:
027:        import org.griphyn.cPlanner.namespace.VDS;
028:
029:        import org.griphyn.cPlanner.poolinfo.PoolInfoProvider;
030:
031:        import java.io.File;
032:        import java.io.IOException;
033:        import java.io.FileWriter;
034:
035:        import java.util.Iterator;
036:        import java.util.Set;
037:
038:        /**
039:         * This uses the Condor File Transfer mechanism for the second level staging.
040:         *
041:         * It will work only if the Pegasus Style profile ( pegasus::style ) has a value
042:         * of condor or glidein.
043:         *
044:         * @author Karan Vahi
045:         * @version $Revision: 449 $
046:         */
047:        public class Condor implements  SLS {
048:
049:            /**
050:             * A short description of the transfer implementation.
051:             */
052:            public static final String DESCRIPTION = "Condor File Transfer Mechanism";
053:
054:            /**
055:             * The handle to the site catalog.
056:             */
057:            protected PoolInfoProvider mSiteHandle;
058:
059:            /**
060:             * The handle to the properties.
061:             */
062:            protected PegasusProperties mProps;
063:
064:            /**
065:             * The handle to the planner options.
066:             */
067:            protected PlannerOptions mPOptions;
068:
069:            /**
070:             * The handle to the logging manager.
071:             */
072:            protected LogManager mLogger;
073:
074:            /**
075:             * The default constructor.
076:             */
077:            public Condor() {
078:            }
079:
080:            /**
081:             * Initializes the SLS implementation.
082:             *
083:             * @param bag the bag of objects. Contains access to catalogs etc.
084:             */
085:            public void initialize(PegasusBag bag) {
086:                mProps = bag.getPegasusProperties();
087:                mPOptions = bag.getPlannerOptions();
088:                mLogger = bag.getLogger();
089:                mSiteHandle = bag.getHandleToSiteCatalog();
090:            }
091:
092:            /**
093:             * Returns a boolean whether the SLS implementation does a condor based
094:             * modification or not. By condor based modification we mean whether it
095:             * uses condor specific classads to achieve the second level staging or not.
096:             *
097:             * @return true
098:             */
099:            public boolean doesCondorModifications() {
100:                return true;
101:            }
102:
103:            /**
104:             * Constructs a command line invocation for a job, with a given sls file.
105:             * The SLS file may be null. In the case where the SLS implementation does not read from a file,
106:             * it is advised to create a file in generateSLSXXX methods, and then read
107:             * the file in this function and put it on the command line.
108:             *
109:             * @param job          the job that is being sls enabled
110:             * @param slsFile      the slsFile can be null
111:             *
112:             * @return invocation string
113:             */
114:            public String invocationString(SubInfo job, File slsFile) {
115:                return null;
116:            }
117:
118:            /**
119:             * Returns a boolean indicating whether it will need an input file for
120:             * a job to do the transfers. Always returns false, as the Condor file
121:             * transfer mechanism does not read the transfers it needs to do from
122:             * an SLS input file.
123:             *
124:             * @param job the job being detected.
125:             *
126:             * @return false
127:             */
128:            public boolean needsSLSInput(SubInfo job) {
129:                return false;
130:            }
131:
132:            /**
133:             * Returns a boolean indicating whether it will need an output file for
134:             * a job to do the transfers. Always returns false, as the Condor file
135:             * transfer mechanism does not use an SLS output file.
136:             *
137:             * @param job the job being detected.
138:             *
139:             * @return false
140:             */
141:            public boolean needsSLSOutput(SubInfo job) {
142:                return false;
143:            }
144:
145:            /**
146:             * Returns the LFN of sls input file.
147:             *
148:             * @param job SubInfo
149:             *
150:             * @return the name of the sls input file.
151:             */
152:            public String getSLSInputLFN(SubInfo job) {
153:                return null;
154:            }
155:
156:            /**
157:             * Returns the LFN of sls output file.
158:             *
159:             * @param job SubInfo
160:             *
161:             * @return the name of the sls output file.
162:             */
163:            public String getSLSOutputLFN(SubInfo job) {
164:                return null;
165:            }
166:
167:            /**
168:             * Generates a second level staging file of the input files to the worker
169:             * node directory.
170:             *
171:             * @param job           job for which the file is being created
172:             * @param fileName      name of the file that needs to be written out.
173:             * @param submitDir     submit directory where it has to be written out.
174:             * @param headNodeDirectory    directory on the head node of the compute site.
175:             * @param workerNodeDirectory  worker node directory
176:             *
177:             * @return null as no SLS file is generated.
178:             */
179:            public File generateSLSInputFile(SubInfo job, String fileName,
180:                    String submitDir, String headNodeDirectory,
181:                    String workerNodeDirectory) {
182:
183:                return null;
184:            }
185:
186:            /**
187:             * Generates a second level staging file of the output files from the
188:             * worker node directory.
189:             *
190:             * @param job the job for which the file is being created
191:             * @param fileName the name of the file that needs to be written out.
192:             * @param submitDir the submit directory where it has to be written out.
193:             * @param headNodeDirectory the directory on the head node of the
194:             *   compute site.
195:             * @param workerNodeDirectory the worker node directory
196:             *
197:             * @return null as no SLS file is generated.
198:             *
199:             */
200:            public File generateSLSOutputFile(SubInfo job, String fileName,
201:                    String submitDir, String headNodeDirectory,
202:                    String workerNodeDirectory) {
203:
204:                return null;
205:
206:            }
207:
208:            /**
209:             * Modifies a job for the first level staging to the headnode. This is to
210:             * add any files that need to be staged to the head node for a job, specific
211:             * to the SLS implementation. If any file needs to be added, a <code>FileTransfer</code>
212:             * object should be created and added as an input or an output file.
213:             *
214:             *
215:             * @param job           the job
216:             * @param submitDir     the submit directory
217:             * @param slsInputLFN   the sls input file if required, that is used for
218:             *                      staging in from the head node to worker node directory.
219:             * @param slsOutputLFN  the sls output file if required, that is used
220:             *                      for staging out from the worker node to the head node directory.
221:             * @return boolean indicating whether the job was successfully modified.
222:             */
223:            public boolean modifyJobForFirstLevelStaging(SubInfo job,
224:                    String submitDir, String slsInputLFN, String slsOutputLFN) {
225:
226:                return true;
227:
228:            }
229:
230:            /**
231:             * Modifies a compute job for second level staging. Adds the appropriate
232:             * condor classads. It assumes that all the files are being moved to and from
233:             * the submit directory directly. Ignores any headnode parameters passed.
234:             *
235:             *
236:             * @param job the job to be modified.
237:             * @param headNodeURLPrefix the url prefix for the server on the headnode
238:             * @param headNodeDirectory the directory on the headnode, where the
239:             *   input data is read from and the output data written out.
240:             * @param workerNodeDirectory the temporary directory on the worker node
241:             *
242:             * @return boolean indicating whether job was successfully modified or
243:             *   not.
244:             *
245:             */
246:            public boolean modifyJobForWorkerNodeExecution(SubInfo job,
247:                    String headNodeURLPrefix, String headNodeDirectory,
248:                    String workerNodeDirectory) {
249:
250:                //sanity check on style of the job
251:                //handle the -w option that asks kickstart to change
252:                //directory before launching an executable.
253:                String style = (String) job.vdsNS.get(VDS.STYLE_KEY);
254:                if (style == null
255:                        || !(style.equals(VDS.CONDOR_STYLE) || style
256:                                .equals(VDS.GLIDEIN_STYLE))) {
257:                    mLogger.log("Invalid style for the job " + job.getName(),
258:                            LogManager.ERROR_MESSAGE_LEVEL);
259:                    return false;
260:                }
261:
262:                //remove any directory. let condor figure it out
263:                job.condorVariables.removeKey("initialdir");
264:
265:                //set the initial dir to the headnode directory
266:                job.condorVariables.construct("initialdir", headNodeDirectory);
267:
268:                //iterate through all the input files
269:                for (Iterator it = job.getInputFiles().iterator(); it.hasNext();) {
270:                    PegasusFile pf = (PegasusFile) it.next();
271:
272:                    //ignore any input files of FileTransfer as they are first level
273:                    //staging put in by the Condor Transfer refiner
274:                    if (pf instanceof  FileTransfer) {
275:                        continue;
276:                    }
277:
278:                    String lfn = pf.getLFN();
279:                    //add an input file for transfer
280:                    //job.condorVariables.addIPFileForTransfer( headNodeDirectory + File.separator + lfn );
281:                    //we add just the lfn as we are setting initialdir
282:                    job.condorVariables.addIPFileForTransfer(lfn);
283:                }
284:
285:                //iterate and add output files for transfer back
286:                for (Iterator it = job.getOutputFiles().iterator(); it
287:                        .hasNext();) {
288:                    PegasusFile pf = (PegasusFile) it.next();
289:                    String lfn = pf.getLFN();
290:
291:                    //ignore any output files of FileTransfer as they are first level
292:                    //staging put in by the Condor Transfer refiner
293:                    if (pf instanceof  FileTransfer) {
294:                        continue;
295:                    }
296:
297:                //add an output file for transfer back
298:                    job.condorVariables.addOPFileForTransfer(lfn);
299:                }
300:
301:                return true;
302:
303:            }
304:
305:        }
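The core logic of modifyJobForWorkerNodeExecution above (validate the pegasus::style profile, then hand plain LFNs to Condor's file transfer mechanism) can be sketched standalone. This is a hedged illustration, not Pegasus code: the class and constant names below are hypothetical stand-ins for the real SubInfo and VDS classes, which are not reproduced here.

```java
import java.util.ArrayList;
import java.util.List;

// A minimal standalone sketch of two pieces of modifyJobForWorkerNodeExecution:
// the pegasus::style sanity check, and the collection of bare LFNs for
// Condor's transfer_input_files / transfer_output_files lists.
public class CondorSLSSketch {

    static final String CONDOR_STYLE = "condor";   // stands in for VDS.CONDOR_STYLE
    static final String GLIDEIN_STYLE = "glidein"; // stands in for VDS.GLIDEIN_STYLE

    /** Mirrors the sanity check at the top of modifyJobForWorkerNodeExecution. */
    static boolean isValidStyle(String style) {
        return style != null
                && (style.equals(CONDOR_STYLE) || style.equals(GLIDEIN_STYLE));
    }

    /**
     * Mirrors the file loops: each LFN is added as-is, with no directory
     * prefix, because initialdir is set to the head node directory and
     * Condor resolves relative paths against it.
     */
    static List<String> collectTransferLFNs(List<String> lfns) {
        List<String> result = new ArrayList<>();
        for (String lfn : lfns) {
            result.add(lfn); // just the LFN, since initialdir is set
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(isValidStyle("condor"));  // prints true
        System.out.println(isValidStyle("globus"));  // prints false
    }
}
```

Note how rejecting a null or unrecognized style before touching any classads matches the early-return structure of the real method.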
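The condor classads that addIPFileForTransfer / addOPFileForTransfer and the initialdir setting feed into correspond, roughly, to submit-description lines like the following. This is only an illustration: the directory and file names are made up, and the exact output is produced by Pegasus's Condor submit file writer, not shown here.

```
initialdir              = /shared/scratch/headnode-dir
transfer_input_files    = f.in1,f.in2
transfer_output_files   = f.out
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
```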