Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/63953
Closes #61138
Test:
`python -m torch.distributed.run --nproc_per_node 2 test.py`
still outputs the warning message.
`LOGLEVEL=ERROR python -m torch.distributed.run --nproc_per_node 2 test.py`
no longer outputs it.
cc pietern mrshenli pritamdamania87 zhaojuanmao satgera rohan-varma gqchen aazzolini osalpekar jiayisuse agolynski SciPioneer H-Huang mrzzd cbalioglu gcramer23
Test Plan: Imported from OSS
Reviewed By: malfet
Differential Revision: D30542997
Pulled By: H-Huang
fbshipit-source-id: e7da30dcda51516abf4e56f1f510132e44397027
```diff
     nproc_per_node = determine_local_world_size(args.nproc_per_node)
     if "OMP_NUM_THREADS" not in os.environ and nproc_per_node > 1:
         omp_num_threads = 1
-        print(
+        log.warning(
             f"*****************************************\n"
             f"Setting OMP_NUM_THREADS environment variable for each process to be "
             f"{omp_num_threads} in default, to avoid your system being overloaded, "
```
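Why this change makes the message suppressible: `print` always writes to stdout, but a `logging` call is filtered by the logger's effective level, which elastic utilities derive from the `LOGLEVEL` environment variable. A minimal sketch of that mechanism (the logger setup here is illustrative, not the exact code in `torch.distributed.run`):

```python
import logging
import os

# Configure the root logger level from LOGLEVEL, defaulting to WARNING.
# This mirrors (in simplified form) how elastic reads the env var.
logging.basicConfig(level=os.environ.get("LOGLEVEL", "WARNING"))
log = logging.getLogger(__name__)

# With LOGLEVEL unset or set to WARNING/INFO, this message is emitted;
# with LOGLEVEL=ERROR, the WARNING-level record is filtered out.
log.warning("Setting OMP_NUM_THREADS environment variable for each process")
```

Running the same script with `LOGLEVEL=ERROR` suppresses the warning without touching the code, which is exactly what the test above verifies.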