Schedule and modify jobs#
This example shows how to create a job request, schedule it for execution in the future, and modify the scheduled job before it completes.
This example uses an Excel import job, but the functionality demonstrated here can be applied to text import and Excel export jobs.
Connect to Granta MI#
Import the Connection class and create the connection. For more information, see the Connect and access the job queue example.
[1]:
from ansys.grantami.jobqueue import Connection
server_url = "http://my_grantami_server/mi_servicelayer"
client = Connection(server_url).with_credentials("user_name", "password").connect()
Create an ExcelImportJobRequest object#
This cell creates an Excel import job request and schedules it for execution tomorrow. This example does not contain a full description of the ExcelImportJobRequest object. For more information, see the Create an Excel import job example.
[2]:
import datetime
import pathlib
try:
    # Python 3.11+
    from datetime import UTC as utc
except ImportError:
    # Python 3.9 and 3.10
    from datetime import timezone

    utc = timezone.utc

from ansys.grantami.jobqueue import ExcelImportJobRequest

tomorrow = datetime.datetime.now(utc) + datetime.timedelta(days=1)
combined_excel_import_request = ExcelImportJobRequest(
    name="Excel Import (combined template and data file)",
    description="An example excel import job",
    combined_files=[pathlib.Path("assets/combined_import_file.xlsx")],
    scheduled_execution_date=tomorrow,
)
combined_excel_import_request
[2]:
<ExcelImportJobRequest: name: "Excel Import (combined template and data file)">
Submit the job#
Next, submit the job to the server. There are two methods for submitting job requests:

create_job(): Submit the job request to the server and immediately return an AsyncJob object in the pending state.
create_job_and_wait(): Submit the job request to the server and block until the job either completes or fails. Return an AsyncJob object in the succeeded or failed state.

Because you have configured the Excel job request object to execute tomorrow, you must use the create_job() method. If you used the create_job_and_wait() method, it would block until the job completed, which means this script would take roughly 24 hours to complete.
[3]:
deferred_job = client.create_job(combined_excel_import_request)
deferred_job
[3]:
<AsyncJob: name: "Excel Import (combined template and data file)", status: "JobStatus.Pending">
List jobs#
Use the .jobs property to list the jobs on the server.
[4]:
client.jobs
[4]:
[<AsyncJob: name: "Excel Import (combined template and data file)", status: "JobStatus.Pending">,
<AsyncJob: name: "Excel Export", status: "JobStatus.Succeeded">,
<AsyncJob: name: "Text Import", status: "JobStatus.Succeeded">,
<AsyncJob: name: "Excel Import (separate template and data files)", status: "JobStatus.Succeeded">]
Note that only jobs that you have access to are included in this property. Non-administrator users can only access and modify their own jobs. Administrator users can access and modify all jobs on the server.
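If you only need a subset of these jobs, the list can be filtered client-side. The following sketch selects pending jobs by status; JobStatus is imported from ansys.grantami.jobqueue, as shown later in this example:

from ansys.grantami.jobqueue import JobStatus

# Sketch: keep only jobs that have not started executing yet
pending_jobs = [job for job in client.jobs if job.status == JobStatus.Pending]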
Edit existing jobs#
You can edit the properties of a running or completed job with the associated AsyncJob object. The following cell shows how to update the name and description of the job and to change the scheduled execution to occur immediately.
[5]:
deferred_job.update_name("Combined Excel Import (modified)")
deferred_job.update_description("A new description for a combined Excel import job")
now = datetime.datetime.now(utc)
deferred_job.update_scheduled_execution_date_time(now)
client.jobs
[5]:
[<AsyncJob: name: "Combined Excel Import (modified)", status: "JobStatus.Pending">,
<AsyncJob: name: "Excel Export", status: "JobStatus.Succeeded">,
<AsyncJob: name: "Text Import", status: "JobStatus.Succeeded">,
<AsyncJob: name: "Excel Import (separate template and data files)", status: "JobStatus.Succeeded">]
Retrieve long-running jobs#
If the job is expected to take a long time to complete, you can save the job ID to disk and use it with the client.get_job_by_id() method to check the status of the job later.
[6]:
job_id = deferred_job.id
retrieved_job = client.get_job_by_id(job_id)
retrieved_job
[6]:
<AsyncJob: name: "Combined Excel Import (modified)", status: "JobStatus.Pending">
Wait for the pending job to complete.
[7]:
import time
from ansys.grantami.jobqueue import JobStatus
while deferred_job.status not in [JobStatus.Succeeded, JobStatus.Failed]:
    time.sleep(1)
    deferred_job.update()
deferred_job.status
[7]:
<JobStatus.Succeeded: 'Succeeded'>
Access output files#
The job is now complete. You can access the files generated by the job in the same way as for jobs that execute immediately.
[8]:
log_file_name = next(name for name in deferred_job.output_file_names if "log" in name)
log_file_content = deferred_job.get_file_content(log_file_name)
log_file_string = log_file_content.decode("utf-8")
print(f"{log_file_name} (first 200 characters):")
print(f"{log_file_string[:500]}...")
Combined Excel Import (modified).log (first 200 characters):
2024-10-03 07:03:32,747 [26] INFO Task started: Template:'combined_import_file', Step:'Pre-Import Validation: Problematic Values & Functions', Datafile:'combined_import_file', Culture:'en-US'
2024-10-03 07:03:32,747 [26] INFO Task finished.
2024-10-03 07:03:32,763 [26] INFO Task started: Template:'C:\Windows\SystemTemp\excelImportTemp\cfm4upun.1cz\assets/combined_import_file.xlsx', Step:'Default Template', Datafile:'C:\Windows\SystemTemp\excelImportTemp\cfm4upun.1cz\assets/combined_import_...
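The bytes returned by get_file_content() can also be written to disk with the standard library if you want to keep the generated files. A short sketch; the output directory name is arbitrary:

# Sketch: write the log file returned by the job to a local directory
output_dir = pathlib.Path("job_outputs")  # arbitrary local directory
output_dir.mkdir(exist_ok=True)
(output_dir / log_file_name).write_bytes(log_file_content)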
Delete a job#
You can delete jobs from the job queue using the .delete_jobs() method. This method accepts a list of jobs, which means that you can delete multiple jobs with a single request if required.
[9]:
client.delete_jobs([deferred_job])
client.jobs
[9]:
[<AsyncJob: name: "Excel Export", status: "JobStatus.Succeeded">,
<AsyncJob: name: "Text Import", status: "JobStatus.Succeeded">,
<AsyncJob: name: "Excel Import (separate template and data files)", status: "JobStatus.Succeeded">]