Long-Running Tasks#

On a typical JupyterHub, a user’s JupyterLab does not stop when the user logs out from the hub. Thus, it is possible to let some Python code (neural network training, for instance) run for several days without having the notebook open in a web browser all the time. In this project we test the workflow for such long-running tasks and discuss some caveats.

A Long-Running Task#

Task: Put the following Python code into a notebook on a JupyterHub:

import time

for i in range(60):
    print(f'iteration {i}')
    time.sleep(1)
    
print('finished')

Save the notebook, run the cell, and let it run for 5 seconds. Then log out from the hub (without stopping the kernel), wait 5 seconds, and log in again. Use a second cell to print a message and wait until the message appears (this may take up to 50 seconds). What do you learn from this experiment?

Solution:

# your answers

Simple Logging#

Task: Now do the same with the following code:

import time

with open('log.txt', 'w') as f:

    for i in range(60):
        print(f'iteration {i}')
        f.write(f'iteration {i}\n')
        time.sleep(1)
    
print('finished')

After execution has finished, open the file log.txt. What do you see?

Solution:

# your answers
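
As a side note, instead of writing the log file by hand, the same idea can be expressed with Python’s standard logging module. The following is only a minimal sketch, not part of the task; the file name log2.txt and the message format are arbitrary choices for illustration:

import logging
import time

# Write one line per iteration to a file; the file handler flushes
# after each record, so lines appear on disk as soon as they are logged.
logging.basicConfig(filename='log2.txt', filemode='w',
                    format='%(asctime)s %(message)s',
                    level=logging.INFO)

for i in range(60):
    logging.info('iteration %d', i)
    time.sleep(1)

logging.info('finished')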

Capturing All Output#

Writing our own log file does not capture output from library code or error messages. Thus, we need another approach.
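
To see the problem, consider library code that emits a warning: the text goes to stderr, so it never reaches a log file that we write ourselves. A minimal illustration (the message text is just a placeholder):

import warnings

# warnings.warn writes to stderr (via warnings.showwarning);
# nothing here ends up in a file like log.txt that we open and write to ourselves.
warnings.warn('this message bypasses our hand-written log file')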

Task: Run the above test procedure with the following code:

%%capture cap

import time

for i in range(60):
    print(f'iteration {i}')
    time.sleep(1)
    
print('finished')

When finished, use cap.show() to see the captured output.
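
The captured text can also be written to disk, for example to keep it after the kernel stops. A minimal sketch using the stdout and stderr attributes of the object created by %%capture (the file name is an arbitrary choice):

# cap holds the output captured by the %%capture cell above.
with open('captured.txt', 'w') as f:
    f.write(cap.stdout)   # everything printed to stdout
    f.write(cap.stderr)   # error messages and warnings, if any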