Research Computing provides hardware and software services to the UF research community. Our efforts focus on activities that depend on large-scale computing, such as high-performance computing, high-throughput computing, and the analysis of large data sets known as “big data.”


HiPerGator Changes Fall 2015 and Spring 2016

The first phase of the upgrade to HiPerGator was completed in November 2015 and marked by the Celebration of UF Research and HiPerGator 2.0 Day on December 1, 2015. This phase consisted of the installation and thorough testing of the 30,000-core addition referred to as HiPerGator 2.0.

The second phase is ongoing and consists of several upgrades and changes to HiPerGator operations and services. To avoid disrupting service to researchers, users will be migrated in stages from the old part of the system to the new one. The old part of the system will then be upgraded and integrated into the new combined HiPerGator. This process is expected to be completed by the end of May 2016.

The changes are described in detail here, but can be summarized as follows:

  1. Transition to GatorLink authentication to increase security in this age of cyber attacks
  2. Transition from the Moab job scheduler to the Slurm job scheduler
  3. Upgrade of the file system from Lustre 1.8 to Lustre 2.5 to unify and simplify the short-term scratch, long-term storage, and replicated storage services

User training sessions will be scheduled in January 2016 to assist users in the transition. Detailed schedules will be available in early January.

Broadly speaking, the services fall into the following categories:

  • Comprehensive support
  • Professional hardware maintenance
  • Software installation and support
  • Timely software patches and updates
  • 24×7 system monitoring
  • Access to interactive hosts for testing and development
  • Ability to use more cores than purchased during periods of non-peak demand

Provided Resources

  • Large compute cluster for high-performance and high-throughput computing
  • Fast storage for large data sets
  • Network connectivity via the high-speed Campus Research Network
  • System software, compilers, clustering software, and many open source applications
  • Home storage space
  • A pool of local spare parts to minimize downtime
  • Machine room space, electrical power, and cooling