torch._logging
PyTorch has a configurable logging system, where different components can be given different log level settings. For instance, one component’s log messages can be completely disabled, while another component’s log messages can be set to maximum verbosity.
Warning: This feature is in beta and may have compatibility-breaking changes in the future.
Warning: This feature has not yet been expanded to control the log messages of all components in PyTorch.
There are two ways to configure the logging system: through the environment variable TORCH_LOGS or the Python API torch._logging.set_logs.
- set_logs: Sets the log level for individual components and toggles individual log artifact types.
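For example, here is a minimal sketch (assuming a toy function compiled with torch.compile) that enables debug-level TorchDynamo logging plus graph break messages via the Python API:

```python
import logging

import torch

# Equivalent to TORCH_LOGS="+dynamo,graph_breaks": "+" lowers the dynamo
# component to logging.DEBUG, and graph_breaks toggles that artifact on.
torch._logging.set_logs(dynamo=logging.DEBUG, graph_breaks=True)

@torch.compile
def fn(x):
    return torch.sin(x) + torch.cos(x)

# Compiling fn now emits TorchDynamo debug logs and graph break messages.
fn(torch.randn(4))
```

Running the same script with TORCH_LOGS="+dynamo,graph_breaks" set in the environment achieves the same effect without code changes.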
The environment variable TORCH_LOGS is a comma-separated list of [+-]<component> pairs, where <component> is a component specified below. The + prefix decreases the log level of the component, displaying more log messages, while the - prefix increases the log level and displays fewer log messages. When a component is not specified in TORCH_LOGS, its default setting applies. In addition to components, there are also artifacts. An artifact is a specific piece of debug information associated with a component that is either displayed or not displayed, so prefixing an artifact with + or - is a no-op. Since artifacts are associated with a component, enabling that component will typically also enable its artifacts, unless an artifact was specified to be off_by_default. This option is specified in _registrations.py for artifacts that are so spammy they should only be displayed when explicitly enabled.
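As a rough model of these semantics (this is a toy sketch, not PyTorch's actual parser), the prefix-to-level mapping described above, and confirmed by the examples at the end of this page, looks like:

```python
import logging

# Toy sketch of the documented TORCH_LOGS component semantics:
# "+" means more verbose (DEBUG), "-" means less verbose (ERROR),
# and no prefix means logging.INFO. Artifacts are booleans instead.
def parse_torch_logs(value: str) -> dict[str, int]:
    levels = {}
    for entry in value.split(","):
        entry = entry.strip()
        if entry.startswith("+"):
            levels[entry[1:]] = logging.DEBUG
        elif entry.startswith("-"):
            levels[entry[1:]] = logging.ERROR
        else:
            levels[entry] = logging.INFO
    return levels

print(parse_torch_logs("+dynamo,aot"))  # {'dynamo': 10, 'aot': 20}
```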
The following components and artifacts are configurable through the TORCH_LOGS environment variable (see torch._logging.set_logs for the Python API):
- Components:
  - all: Special component which configures the default log level of all components. Default: logging.WARN
  - dynamo: The log level for the TorchDynamo component. Default: logging.WARN
  - aot: The log level for the AOTAutograd component. Default: logging.WARN
  - inductor: The log level for the TorchInductor component. Default: logging.WARN
  - your.custom.module: The log level for an arbitrary unregistered module. Provide the fully qualified name and the module will be enabled (see the sketch after this list). Default: logging.WARN
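Through the Python API, named components are keyword arguments of set_logs, while arbitrary modules go through its modules mapping of fully qualified names to log levels; a minimal sketch (using the placeholder module name from the list above):

```python
import logging

import torch

# Named components are keyword arguments; arbitrary unregistered modules
# are configured via the `modules` dict. "your.custom.module" is a
# placeholder name, as in the component list above.
torch._logging.set_logs(
    dynamo=logging.INFO,
    modules={"your.custom.module": logging.DEBUG},
)
```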
- Artifacts:
  - bytecode: Whether to emit the original and generated bytecode from TorchDynamo. Default: False
  - aot_graphs: Whether to emit the graphs generated by AOTAutograd. Default: False
  - aot_joint_graph: Whether to emit the joint forward-backward graph generated by AOTAutograd. Default: False
  - compiled_autograd: Whether to emit logs from compiled_autograd. Default: False
  - ddp_graphs: Whether to emit graphs generated by DDPOptimizer. Default: False
  - graph: Whether to emit the graph captured by TorchDynamo in tabular format. Default: False
  - graph_code: Whether to emit the Python source of the graph captured by TorchDynamo. Default: False
  - graph_breaks: Whether to emit a message when a unique graph break is encountered during TorchDynamo tracing. Default: False
  - guards: Whether to emit the guards generated by TorchDynamo for each compiled function. Default: False
  - recompiles: Whether to emit a guard failure reason and message every time TorchDynamo recompiles a function. Default: False
  - output_code: Whether to emit the TorchInductor output code. Default: False
  - schedule: Whether to emit the TorchInductor schedule. Default: False
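The artifacts above are toggled with booleans through the Python API; a minimal sketch enabling a few TorchDynamo artifacts:

```python
import torch

# Artifact toggles are booleans; this is equivalent to
# TORCH_LOGS="graph_breaks,guards,recompiles".
torch._logging.set_logs(graph_breaks=True, guards=True, recompiles=True)
```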
- Examples:
  - TORCH_LOGS="+dynamo,aot" will set the log level of TorchDynamo to logging.DEBUG and AOT to logging.INFO
  - TORCH_LOGS="-dynamo,+inductor" will set the log level of TorchDynamo to logging.ERROR and TorchInductor to logging.DEBUG
  - TORCH_LOGS="aot_graphs" will enable the aot_graphs artifact
  - TORCH_LOGS="+dynamo,schedule" will set the log level of TorchDynamo to logging.DEBUG and enable the schedule artifact
  - TORCH_LOGS="+some.random.module,schedule" will set the log level of some.random.module to logging.DEBUG and enable the schedule artifact
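For reference, the corresponding calls through the Python API look roughly like this (each call shown as an independent configuration):

```python
import logging

import torch

# TORCH_LOGS="+dynamo,aot"
torch._logging.set_logs(dynamo=logging.DEBUG, aot=logging.INFO)

# TORCH_LOGS="-dynamo,+inductor"
torch._logging.set_logs(dynamo=logging.ERROR, inductor=logging.DEBUG)

# TORCH_LOGS="aot_graphs"
torch._logging.set_logs(aot_graphs=True)

# TORCH_LOGS="+dynamo,schedule"
torch._logging.set_logs(dynamo=logging.DEBUG, schedule=True)
```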