Summary: needs, goals, capabilities, and opportunities

Institution / Category | Sharable Hardware Resources | Trainings | Student Opportunities | Collaboration Directions | Proposal Directions | Willing to Host Symposium or Meetups | Contact (on RMACC Slack or email) | What would you like RMACC to provide your institution? | Additional Comments
---|---|---|---|---|---|---|---|---|---
CU Boulder | 1. CUmulus 2. Alpine 3. Summit | | | Centralized resources | | Yes to both; has hosted most Symposia | | |
University of Arizona | | | | | | | | |
Arizona State University | Currently contributing to Jetstream2 and OSG | Host ~60 workshops annually, including a motivational supercomputing workshop; happy to share materials. Some discussion in User Facilitator meetings. | ACCESS affinity groups, grants, outreach and engagement, REUs | | | Yes to both; hosted the 2023 Symposium and the 2019 sysadmin meetup | Jason <yalim@asu.edu> | Expertise, practice standards, wider outreach channels, student opportunities, staff trainings | We have two certified CyberAmbassador trainers on our team and 4-5 certified Carpentry instructors
Idaho National Laboratory | None at this time, unless you are collaborating with an INL researcher or working on a nuclear energy project | Training videos and resources are not generally accessible outside INL, but we are open to getting them cleared for sharing | INL hosts hundreds of internships annually, including dozens in HPC-related fields | | | Sysadmin meetup | | Expertise, practice standards, wider outreach channels, staff trainings | Setting up a cross-institutional mentoring/internship program may be interesting; at least something to discuss
CU Denver | We could switch from OSG to RMACC to satisfy our obligation for outside use; just consolidating the system setup for that would be very beneficial | | | | CU Denver becomes eligible for CC* Compute five years after its last award, in mid-2025; more likely a contribution to RMACC Alpine or its successor than a local system. Maybe a storage proposal now? | | Jan.Mandel@ucdenver.edu | |
Northern Arizona University | | Willing to share training/workshop materials. Current workshops: intro to Monsoon, in-depth Monsoon, Globus, Linux command line | | | | Willing to host both | chris.coffey@nau.edu | Expertise, practice standards, wider outreach channels, student opportunities, staff trainings |
Montana State University | Tempest HPC (https://www.montana.edu/uit/rci/tempest/) is gaining OSG integration; future proposals could open direct regional access | Recorded HPC trainings; 1:1 Q&A sessions with internal users | Resources available to MSU courses; the RCI group has 4 student positions | | | | coltran@montana.edu | |
New Mexico Institute of Mining & Technology | None currently | None currently | None currently | Data storage and sysadmin support | Data storage and sysadmin support | Yes | joel.sharbrough@nmt.edu, john.naliboff@nmt.edu | Access to Alpine supercomputing, VASP, cheaper cloud storage, sysadmin support/training |
University of Utah | Nothing currently defined, but if needed we are open to discussions | We offer CHPC-focused trainings each semester: about 25 different sessions, open to all, no account needed. See https://chpc.utah.edu/presentations/index.php for details. Willing to share materials. | We have a campus-level student program; students at other RMACC institutions who are interested should reach out. We can also share details on the student program run through central IT (www.sudo.utah.edu) | No specifics, but there is ongoing interest in exploring possible collaborations in the region | No specifics, but we are always open to discussing collaborative proposals | Yes | anita.orendt@utah.edu, sam.liston@utah.edu, brian.haymore@utah.edu | Continue to facilitate institutions exploring collaborative opportunities and sharing best practices; continue things such as the monthly user services call, special interest groups, and the sysadmin meetup; continue looking for ways to engage new, emerging institutions in the region and to reach out to regional institutions not yet involved with RMACC | Setting up a cross-institutional mentoring program may be interesting; at least something to discuss
University of New Mexico | Nothing currently defined, but if needed we are open to discussions, particularly about how export-controlled and patient data is (or is not) handled | We offer in-person workshops and online video tutorials: https://www.youtube.com/playlist?list=PLvr5gRBLi7VAzEB_t5aXOLHLfdIu2s1hZ | I hire 2-3 student assistants per semester; in the past they have been graduate students, and this year we are trying undergraduates | We would like to collaborate as much as possible | | | | |
Colorado School of Mines | None currently | We offer a "New HPC User Bootcamp" roughly once a semester; looking to expand to more HPC-related topics and more frequent trainings in the future. Run by Nicholas Danes (Computational Scientist/HPC Facilitator) | None currently | Nothing specific, but curious about collaborating on standardizing training/educational materials across RMACC institutions | No specifics, but open to discussing proposals | Yes | ndanes@mines.edu, kirawells@mines.edu, rgilmore@mines.edu | Alpine supercomputing, meetups, standardization |
USGS | | | | | | | | |
NCAR/UCAR | | | | | | | | |