User Activity

  • Modified a comment on discussion General Discussion on slurm-roll

    I recently brought up a new head node and installed the slurm Rocks roll (release-7.0.0-19.05.02). Things were working, but the node configs seemed out of sync. The nodes are dual AMD EPYC 7451 24-core, with SMT on. Originally they were configured as 96 CPUs. I ran "rocks report slurm_hwinfo | sh" followed by "rocks sync slurm", which configured them as 8 sockets, 6 cores, 2 threads: epyc-compute-1-1 1 CLUSTER drained 96 8:6:2 128587 197999 205757 rack-1,9 Low socket*core coun They all show up as Low socket*core count and I...

  • Modified a comment on discussion General Discussion on slurm-roll

  • Modified a comment on discussion General Discussion on slurm-roll

  • Modified a comment on discussion General Discussion on slurm-roll

  • Modified a comment on discussion General Discussion on slurm-roll

  • Posted a comment on discussion General Discussion on slurm-roll

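If the drained state in the comment above really does come from slurm.conf and the hardware disagreeing about the socket/core/thread layout (8 x 6 x 2 from the generated hwinfo versus the 2 sockets x 24 cores x 2 threads = 96 CPUs a dual EPYC 7451 with SMT normally presents), a minimal recovery sketch looks like the following. The node name is taken from the sinfo line in the snippet; the slurm.conf entry is only an illustration, not the actual output of rocks report slurm_hwinfo.

    # On the compute node: print the topology slurmd actually detects,
    # in slurm.conf syntax, and compare it with the generated config.
    slurmd -C

    # Illustrative slurm.conf node line for a dual EPYC 7451 with SMT on
    # (2 sockets x 24 cores x 2 threads = 96 CPUs):
    NodeName=epyc-compute-1-1 Sockets=2 CoresPerSocket=24 ThreadsPerCore=2 State=UNKNOWN

    # Once the config and the detected topology agree (and the config has
    # been pushed out again, e.g. with rocks sync slurm), clear the drain:
    scontrol update NodeName=epyc-compute-1-1 State=RESUME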

Personal Data

Username: munorc
Joined: 2019-11-12 20:07:21

Projects

  • No projects to display.
