Neon shutdown
The Neon HPC cluster will be shut down on January 7, 2019. Neon nodes that are still functional will be carried over and incorporated into Argon as of January 21, 2019. This means that the data in your Neon home/scratch directories will have to be copied either to your existing Argon storage or to local storage. Below is a timeline of the transition.
- May 1, 2018 - HPC Model Change Decision Made & Consumer GPU Nodes Announced
- July/August 2018 - First Phase 2 Argon GPU Nodes Installed
- November 1, 2018 - No New Neon Accounts Created
- January 7, 2019 - Neon System Shutdown
- January 21, 2019 - Neon Nodes Reinstalled and Accessible in Argon
- March 1, 2019 - Last day to transfer data off Neon /home and /nfsscratch
Further information regarding the transition is available on the HPC documentation site.
Your storage:
Your data on Neon will only be retained until March 1, 2019. Please be advised that compressed data copied from Neon home/scratch to a local machine may need to be decompressed, and may end up larger than the size reported on Neon. When copying data from Neon → Argon homes, note that your Argon home directory cannot exceed 1TB of data, and that you will be unable to submit jobs once your usage exceeds 90% of your quota. To check your current storage, use the command below:
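The exact quota-reporting command is site-specific; as a minimal sketch, standard Linux tools such as quota and du will report your usage (du can take a while on large home directories):

quota -s    # per-filesystem usage and limits, in human-readable units
du -sh ~    # total size of everything under your home directory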
If you receive a message stating that your home account has reached its quota and you cannot write any more data, you must remove some large files from your home directory. Sometimes the rm command itself will fail with a "Quota has been reached" message. If this happens, here is what you can do to correct the issue:
- Pick a large, unwanted file to remove.
- Copy the contents of /dev/null over this file. This truncates the file to zero bytes, freeing the space it occupied:
cp /dev/null name_of_file_to_delete
- Once you have done this, you can use 'rm' as you normally would to free up more space.
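As a worked example of the sequence above, using a hypothetical large file named huge_output.dat:

cp /dev/null huge_output.dat    # truncate the file to zero bytes, freeing quota
rm huge_output.dat              # rm now succeeds because the quota is no longer full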
When copying large amounts of data from Neon to Argon's nfsscratch, remember that data placed there will be purged after 60 days as part of our scratch cleaning policy. The cleaning policy on neon-nfsscratch will be disabled on January 7 so that no data will be removed by that policy before the host is decommissioned on March 7.
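For large transfers, a resumable command-line tool such as rsync can help. The sketch below is an assumption rather than a site-documented recipe: it is run from an Argon login node, pulls a Neon home directory through data.hpc.uiowa.edu (assuming that server permits rsync over SSH; if it does not, use the sftp or Globus methods described below), and uses "hawkid" and "myproject" as placeholders. The source path follows the table in the next section, and the Argon-side destination path is also an assumption.

rsync -av hawkid@data.hpc.uiowa.edu:/hpchomes/neon/hawkid/myproject/ /nfsscratch/hawkid/myproject/    # -a preserves permissions and timestamps; rerunning resumes an interrupted copy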
Methods to obtain data:
Login access to Neon will be going away on January 7, so we advise against relying on mounting your home account to transfer data. The preferred method is to use data.hpc.uiowa.edu.
Fetch, WinSCP, and GridFTP all connect to this server address, as documented below. Refer to the table below to choose the path that pertains to you.
| Path | Description |
| --- | --- |
| /hpchomes/<cluster>/<hawkid>/ | Your HPC home directory. For <cluster>, this is either "neon" or "argon". |
| /Dedicated/<sharename>/ | Dedicated shares |
| /Shared/<sharename>/ | Shared shares |
| /nfsscratch/<cluster>/ | The NFS scratch filesystem. For <cluster>, this is either "neon" or "argon". |
Fetch (Mac only): This will connect to your Neon account. Follow the table above to set a path to your data in shared shares, dedicated shares, or nfsscratch; the path below is just for home directories. You can then execute "get" commands to pull your data from Neon onto your local workstation (keeping in mind the decompression note above).
Note: This method will require Duo authentication.
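Fetch is a graphical client, but the same transfer can be done from the macOS Terminal with sftp; expect a Duo prompt at login. In this sketch, "hawkid" and "myproject" are placeholders for your own HawkID and directory:

sftp hawkid@data.hpc.uiowa.edu
# at the sftp> prompt:
cd /hpchomes/neon/hawkid
get -r myproject    # -r recursively downloads the directory to your current local folder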
WinSCP (Windows only): This will connect to your Neon account. Follow the table above to set a path to your data in shared shares, dedicated shares, or nfsscratch; the path below is just for home directories. The server address is simply data.hpc.uiowa.edu; you will need to navigate to the proper directory manually. Once connected, you can drag and drop files from Neon onto your local workstation.
After you authenticate with Duo, note the path displayed at the top of the window. Double-click its very first character, the forward slash; this opens an "Open directory" window, where you can then enter the path from the table above for dedicated, shared, or nfsscratch shares.
Globus/GridFTP:
Please see the Globus Online documentation.
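For a scriptable route, the Globus CLI can drive the same GridFTP endpoints. This is a minimal sketch, not a site-documented procedure: the endpoint UUIDs and paths below are placeholders, and the real endpoints can be located with "globus endpoint search".

globus login                     # authenticate via your browser
globus endpoint search uiowa     # find the endpoint UUIDs to use below
globus transfer --recursive NEON_ENDPOINT_UUID:/hpchomes/neon/hawkid/myproject DEST_ENDPOINT_UUID:/destination/path    # submits an asynchronous transfer task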