19

Let's imagine I have a dict:

d = {'a': 3, 'b':4}

I want to create a function f that does exactly the same thing as this function:

def f(x, a=d['a'], b=d['b']):
  print(x, a, b)

(Not necessarily print, but do some stuff with the variables, referring to them directly by name.)

But I would like to create this function directly from the dict, that is to say, I would like to have something that looks like

def f(x, **d=d):
  print(x, a, b)

and that behaves like the previously defined function. The idea is that I have a large dictionary that contains default values for the arguments of my function, and I would like not to have to write

def f(a=d['a'], b=d['b'], ...)

I don't know if it's possible at all in Python. Any insight is appreciated!

Edit: The idea is to be able to call f(5, a=3).

Edit 2: The question is not about passing arguments stored in a dict to a function, but about defining a function whose argument names and default values are stored in a dict.

13
  • 2
    You can create a function that takes only the dictionary, and then parse the dictionary inside the function. Commented Aug 21, 2019 at 13:08
  • 6
    Why not pass the entire dict? What is the reason to search for tricks like this? Commented Aug 21, 2019 at 13:09
  • 2
    The idea is that I want to be able to call f(3, a=5) and not have to pass it a dictionary. Commented Aug 21, 2019 at 13:14
  • 2
    There isn't a "clean" way to do this. To create local variables inside a function (which is what keyword parameters are, effectively), you may have to resort to exec, as shown in this answer to a related question. Commented Aug 21, 2019 at 13:44
  • 5
    While I found this to be an interesting question, I think it's probably both safer and more idiomatic to rework your thinking and your design so that you just use explicit dictionaries (or other objects) instead of the function's local namespace. Commented Aug 21, 2019 at 14:03

8 Answers

9

You cannot achieve this at function definition time because Python determines the scope of a function statically. However, it is possible to write a decorator that adds default keyword arguments.

from functools import wraps

def kwargs_decorator(dict_kwargs):
    def wrapper(f):
        @wraps(f)
        def inner_wrapper(*args, **kwargs):
            new_kwargs = {**dict_kwargs, **kwargs}
            return f(*args, **new_kwargs)
        return inner_wrapper
    return wrapper

Usage

@kwargs_decorator({'bar': 1})
def foo(**kwargs):
    print(kwargs['bar'])

foo() # prints 1

Or alternatively if you know the variable names but not their default values...

@kwargs_decorator({'bar': 1})
def foo(bar):
    print(bar)

foo() # prints 1

Caveat

The above can be used, for example, to dynamically generate multiple functions with different default arguments. However, if the parameters you want to pass are the same for every function, it would be simpler and more idiomatic to simply pass in a dict of parameters.
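
For illustration, a minimal sketch of that plainer approach (the defaults dict and the params parameter name are placeholders of mine, not part of the answer above):

# Keep the defaults in a dict and pass the dict (or a per-call copy
# with overrides) around explicitly.
defaults = {'a': 3, 'b': 4}

def f(x, params):
    # Look the values up by key instead of relying on injected local names.
    print(x, params['a'], params['b'])

f(5, {**defaults, 'a': 7})  # prints: 5 7 4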


12 Comments

This is cool, but I still have to do kwargs['bar'] in the code of foo; I would like to call print(bar) in the code of foo.
So you want to set an argument, not a kwarg? I don't think this is possible, since the local scope is built at function definition.
Once again, the idea is that I have a lot of variables and I don't want to parse all of them manually.
@StatisticDean, you want to access function arguments using keywords without writing those keywords in the declaration?
Just to be clear, if you want separate local variables, you still have to include all the keywords in the function signature. That is, every key that is in the default-value dictionary has to appear in the function signature. So if the dictionary has a dozen keys but your function only really needs three of them, you still have to list all 12 in the function signature.
4

Python is designed such that the local variables of any function can be determined unambiguously by looking at the source code of the function. So your proposed syntax

def f(x, **d=d):
  print(x, a, b)

is a nonstarter because there's nothing that indicates whether a and b are local to f or not; it depends on the runtime value of the dictionary, whose value could change across runs.

If you can resign yourself to explicitly listing the names of all of your parameters, you can automatically set their default values at runtime; this has already been well covered in other answers. Listing the parameter names is probably good documentation anyway.

If you really want to synthesize the whole parameter list at run time from the contents of d, you would have to build a string representation of the function definition and pass it to exec. This is how collections.namedtuple works, for example.
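
For concreteness, a rough sketch of that exec-based approach (make_f is an illustrative name, the function body is hardcoded to the question's print example, and every key of the dict is assumed to be a valid Python identifier):

d = {'a': 3, 'b': 4}

def make_f(defaults):
    # Build a textual def statement whose parameter list comes from the dict...
    params = ', '.join(f'{k}={v!r}' for k, v in defaults.items())
    src = f"def f(x, {params}):\n    print(x, a, b)\n"
    namespace = {}
    # ...and execute it to actually define the function.
    exec(src, namespace)
    return namespace['f']

f = make_f(d)
f(5)       # prints: 5 3 4
f(5, a=7)  # prints: 5 7 4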

Variables in module and class scopes are looked up dynamically, so this is technically valid:

def f(x, **kwargs):
    class C:
        vars().update(kwargs)  # don't do this, please
        print(x, a, b)

But please don't do it except in an IOPCC entry.

1 Comment

The documentation isn't even completely clear that this is "technically valid". I tried it on my computer, and it happens to work for my version of Python, on my machine. But if you look at the documentation for vars(), locals(), and __dict__, you get the distinct impression that this falls into what is commonly referred to by C folks as "undefined behavior".
4

try this:

# Store the default values in a dictionary
>>> defaults = {
...     'a': 1,
...     'b': 2,
... }
>>> def f(x, **kwa):
...     # Each time the function is called, merge the default values
...     # with the provided keyword arguments (Python >= 3.5):
...     args = {**defaults, **kwa}
...     # For Python < 3.5, copy the defaults and merge the provided
...     # arguments into the copy instead:
...     # args = defaults.copy()
...     # args.update(kwa)
...     print(args)
... 
>>> f(1, f=2)
{'a': 1, 'b': 2, 'f': 2}
>>> f(1, f=2, b=8)
{'a': 1, 'b': 8, 'f': 2}
>>> f(5, a=3)
{'a': 3, 'b': 2}

Thanks to Olvin Roght for pointing out how to nicely merge dictionaries in Python >= 3.5.

8 Comments

What is the point of using defaults (an external variable) inside a function? This limits the usability of the function.
@prashantrana - Because that is what the OP is asking for, basically.
Not a good way to do this; you could just use globals().update(kwa), which does the same work.
@prashantrana - No, it's not doing the same thing. OP isn't interested in updating global variables.
You don't need so much code to just merge two dicts: return {**defaults, **kwa}
1

How about the **kwargs trick?

def function(arg0, **kwargs):
    print("arg is", arg0, "a is", kwargs["a"], "b is", kwargs["b"])

d = {"a":1, "b":2}
function(0., **d)

outcome:

arg is 0.0 a is 1 b is 2

2 Comments

The thing is I want my function to have exactly the keys of d as arguments, not arbitrary keywords, and if my d is large, I have to write a ton of code to get the desired behavior, which is counterproductive. (It's faster to write f(x, a=d['a'], b=d['b']) than this.)
Right, I see. Sorry, I misunderstood your question.
1

This question is very interesting, and it seems different people have their own guesses about what the question really wants.

I have my own too. Here is my code, which expresses it:

# python3 only
from collections import defaultdict

# only set once when function definition is executed
def kwdefault_decorator(default_dict):
    def wrapper(f):
        f.__kwdefaults__ = {}
        f_code = f.__code__
        po_arg_count = f_code.co_argcount
        # co_varnames lists positional parameters first, then keyword-only ones
        keys = f_code.co_varnames[po_arg_count : po_arg_count + f_code.co_kwonlyargcount]
        for k in keys:
            f.__kwdefaults__[k] = default_dict[k]
        return f

    return wrapper

default_dict = defaultdict(lambda: "default_value")
default_dict["a"] = "a"
default_dict["m"] = "m"

@kwdefault_decorator(default_dict)
def foo(x, *, a, b):
    foo_local = "foo"
    print(x, a, b, foo_local)

@kwdefault_decorator(default_dict)
def bar(x, *, m, n):
    bar_local = "bar"
    print(x, m, n, bar_local)

foo(1)
bar(1)
# only kw_arg permitted
foo(1, a=100, b=100)
bar(1, m=100, n=100)

output:

1 a default_value foo
1 m default_value bar
1 100 100 foo
1 100 100 bar


0

Posting this as an answer because it would be too long for a comment.

Be careful with the kwargs_decorator answer above. If you try

@kwargs_decorator({'a': 'a', 'b': 'b'})
def f(x, a, b):
    print(f'x = {x}')
    print(f'a = {a}')
    print(f'b = {b}')

f(1, 2)

it will issue an error:

TypeError: f() got multiple values for argument 'a'

because the 2 is bound to a positionally while the decorator also supplies a as a keyword argument.

I implemented a workaround, even though I'm not sure if this is the best solution:

from functools import wraps
from inspect import getfullargspec

def default_kwargs(**default):
    def decorator(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            # Names of f's positional parameters, in declaration order
            f_args = getfullargspec(f)[0]
            # Parameters already filled by this call's positional arguments
            used_args = f_args[:len(args)]

            final_kwargs = {
                key: value
                for key, value in {**default, **kwargs}.items()
                if key not in used_args
            }

            return f(*args, **final_kwargs)
        return wrapper
    return decorator

In this solution, f_args is a list containing the names of all named positional arguments of f. Then used_args is the list of all parameters that have effectively been passed as positional arguments. Therefore final_kwargs is defined almost exactly like before, except that it checks if the argument (in the case above, a) was already passed as a positional argument.

For instance, this solution works beautifully with functions such as the following.

@default_kwargs(a='a', b='b', d='d')
def f(x, a, b, *args, c='c', d='not d', **kwargs):
    print(f'x = {x}')
    print(f'a = {a}')
    print(f'b = {b}')
    for idx, arg in enumerate(args):
        print(f'arg{idx} = {arg}')
    print(f'c = {c}')
    for key, value in kwargs.items():
        print(f'{key} = {value}')

f(1)
f(1, 2)
f(1, b=3)
f(1, 2, 3, 4)
f(1, 2, 3, 4, 5, c=6, g=7)

Note also that the default values passed in default_kwargs have higher precedence than the ones defined in f. For example, the default value for d in this case is actually 'd' (defined in default_kwargs), and not 'not d' (defined in f).


-1

You can unpack the values of the dict:

from collections import OrderedDict

def f(x, a, b):
    print(x, a, b)

d = OrderedDict({'a': 3, 'b':4})
f(10, *d.values())

UPD.

Yes, it's possible to implement this mad idea of modifying the local scope by creating a decorator which returns a class with an overridden __call__() and stores your defaults in the class scope, BUT IT'S MASSIVE OVERKILL.

Your problem is that you're trying to hide problems in your architecture behind these tricks. If you store your default values in a dict, then access them by key. If you want to use keywords, define a class.
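
For illustration, a minimal sketch of the "define a class" suggestion (the Params name and the use of a dataclass are my choices, not the answer's):

from dataclasses import dataclass

@dataclass
class Params:
    a: int = 3
    b: int = 4

def f(x, params: Params = Params()):
    # Access the defaults by attribute; override per call with keyword arguments.
    print(x, params.a, params.b)

f(5)               # prints: 5 3 4
f(5, Params(a=7))  # prints: 5 7 4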

P.S. I still don't understand why this question collected so many upvotes.

3 Comments

minus for ... what?
I didn't downvote, but I don't see how this answers the question, since f doesn't have any defaults.
@wjandrea, it took some time to understand what the author wants (not only for me).
-1

Sure... hope this helps

def funcc(x, **kwargs):
    locals().update(kwargs)
    print(x, a, b, c, d)

kwargs = {'a' : 1, 'b' : 2, 'c':1, 'd': 1}
x = 1

funcc(x, **kwargs)

3 Comments

It's really unsafe to update globals with that many variables. Big chance to shadow something important.
This leaks variables to the outer scope too. Avoid.
Using locals instead of globals will be better.
