Bigger Together than Apart
Faculty applying for research grants may obtain budgetary quotes for additional software and hardware add-ons to BeoShock via this link: BeoShock HPC Budgetary Estimate.
HPC Alliance Program
The HPC Alliance Program is a mechanism for Wichita State University units or individual faculty members to add compute resources to the HPC cluster to meet their HPC resource requirements.
Contributors purchase compatible HPC compute nodes, and WSU IT Services houses the nodes in a secure campus data center with appropriate power, environment, and networking. Contributor nodes share software licenses and storage infrastructure with the other HPC cluster nodes.
A dedicated queue is created for the contributor, with access to the quantity of compute resources the partner added to the cluster, for a period of three years. The contributor, and any other WSU IDs the partner specifies, have access to the dedicated queue. Queue parameters are set based on the partner's needs.
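The page does not name the cluster's scheduler, but on a Slurm-based cluster a dedicated contributor queue is typically expressed as a partition restricted to the contributor's accounts. The sketch below is a hypothetical `slurm.conf` fragment; the partition name, node range, and account name are illustrative, not BeoShock's actual configuration.

```
# Hypothetical dedicated partition for an alliance contributor.
# Only jobs charged to the contributor's account may use it.
PartitionName=alliance_lab Nodes=cn[101-104] AllowAccounts=lab_pi MaxTime=7-00:00:00 State=UP
```

Queue parameters such as maximum wall time or per-user job limits would be tuned here to match what the partner needs.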
In addition to a dedicated queue, the alliance contributor receives increased priority in all queues through a fair-share scheduling methodology weighted by the amount contributed.
Compute resources not being actively used by the contributor are made available to other WSU HPC users.
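The page does not spell out the fair-share formula, but the general idea is that a contributor's priority rises when their recent usage is below the share of the cluster they funded, and falls when it exceeds it. Below is a minimal sketch assuming a Slurm-style exponential fair-share factor, F = 2^(-usage/share); the function name and example numbers are illustrative, not BeoShock's actual configuration.

```python
def fairshare_factor(norm_usage: float, norm_share: float) -> float:
    """Return a priority factor in (0, 1]; higher means higher priority.

    norm_usage: this contributor's fraction of recent cluster usage.
    norm_share: the fraction of the cluster the contributor funded.
    """
    if norm_share <= 0:
        return 0.0
    # Priority halves each time usage exceeds the funded share by
    # another full multiple of that share (Slurm-style decay).
    return 2 ** (-norm_usage / norm_share)

# A contributor who funded 25% of the cluster but recently used only 10%
# is prioritized above one who used 40% against the same 25% share.
under_user = fairshare_factor(norm_usage=0.10, norm_share=0.25)
over_user = fairshare_factor(norm_usage=0.40, norm_share=0.25)
print(under_user > over_user)  # → True
```

This is why idle contributor nodes can safely be lent to other users: any usage the contributor later makes simply draws down their fair-share factor rather than being blocked.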
The HPC Alliance Program offers compelling advantages for both the faculty contributor and the university.
Advantages for the contributor (services provided by the university)
- secure physical space
- redundant power (including UPS and diesel generator)
- environment controlled
- rack (including rack power distribution)
- network infrastructure (including message passing network for distributed memory nodes)
- system administration and maintenance
- fairly distributed priority access to compute resources
- priority access to purchased hardware
- access to shared storage and file systems
- access to university-funded licensed software*
- system and computational science support from HPC support team
Advantages for the university
- multiplies resources provided by university HPC investment
- increased HPC resource utilization yields more efficient use of university-wide research computing dollars
- scaling benefits reduce university-wide cost of HPC facilities (a few large power and cooling units vs. many small power and cooling units)
- scaling benefits reduce the university-wide cost of HPC system support (the incremental system administration and maintenance load for compatible hardware is very small; it takes nearly the same work to operate an 8-processor cluster as it does a 100-processor cluster)
A Huge Thanks to our Investors
Divisions, colleges, and departments donating dollars and time include the Office of Research, College of Engineering, Fairmount College of Liberal Arts and Sciences, W. Frank Barton School of Business, Academic Affairs, and the University Libraries. A special thanks to ITS for volunteering extra time and support!
KanREN, Inc. is a 501(c)(3) non-profit charitable organization providing research connectivity and is committed to making CI resources available to all public higher education institutions in the state of Kansas.
The Great Plains Network (GPN) is a non-profit consortium that aggregates networks through GigaPoP connections and advocates for research on behalf of universities and community innovators across the Midwest and Great Plains seeking collaboration, cyberinfrastructure, and support for big data and big ideas at the speed of the modern Internet.