Is there a way to declare a variable as unused in PyCharm, or in Python in general, so you can explicitly tell the code inspector not to warn about it?

I am not talking about the convention of naming unused variables for the programmer's benefit (often “_” or “__”), but an option to explicitly mark a variable as unused for the inspector, for example in a loop. I also don't want to disable the inspection in general.

I’ve heard that you can do this in PyDev by beginning the variable name with “unused”, and I thought this might exist in PyCharm as well, but couldn’t find it yet.

You can disable this inspection either for a single statement like:

# noinspection PyUnusedLocal
unused_thing = something()

or for a whole function (or class) by placing the comment above the function (or class):

# noinspection PyUnusedLocal
def foo():
    unused_thing = something()

For some reason this particular inspection cannot be switched off via the inspections context menu; it might be worth a PyCharm ticket.

You can easily, and least intrusively, silence PyCharm's unused-local warning (only) for unused function parameters by prefixing them with an underscore.

For example, in the following code PyCharm will not warn about the unused parameter _bar:

def foo(_bar):
    print("nothing here")

I’ve noticed that using a single underscore for the throwaway variable name seems to bypass this check. I’m using PyCharm 2016.1.3.

for _ in range(3):
    pass
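In my experience the same convention carries over to unpacking: binding throwaway values to _ keeps the inspection quiet there as well. A minimal sketch (the record tuple is just an illustrative example):

```python
# Unpack a tuple, discarding the middle value with the throwaway name _.
record = ("alice", "intermediate", 42)
name, _, score = record
print(name, score)  # alice 42
```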

Another way, similar to an UNUSED macro in C++, works if you want to hide the warning for a specific function parameter while keeping the warning enabled for the rest of the function:

# noinspection PyUnusedLocal
def UNUSED(*args, **kwargs):
    pass

def my_function(alpha, beta, gamma):
    UNUSED(gamma)
    return alpha + beta

To extend sebastian's answer: if the function has a decorator, you need to place the

# noinspection PyUnusedLocal

comment above the decorator; if you place it between the decorator and the function definition, it will not work.

# noinspection PyUnusedLocal
@torch.no_grad()
def step(self, closure=None):
    """Performs a single optimization step.

    Arguments:
        closure (callable, optional): A closure that reevaluates the model
            and returns the loss.
    """
    pass