Tags: python, function, error-handling, chaining

Can a Python function be smart enough to show its capability?


I want to define a cluster of functions that can be chained and invoked in one go: given an initial parameter, the chain produces the final result, much like a Linux command pipe:

                     test01() | test02() | test03() | test04()
[init_parameter]  ---------------------------------------------------->  [final result]
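
For illustration (these are simplified stand-ins for the real testXX functions; untested sketch), I imagine invoking the whole chain in one go with something like functools.reduce:

from functools import reduce

# simplified placeholders: each takes a list of ints and returns a list of ints
def test01(a): return [x + 1 for x in a]
def test02(a): return [x * 2 for x in a]
def test03(a): return [x - 3 for x in a]
def test04(a): return [x for x in a if x > 0]

chain = [test01, test02, test03, test04]
init_parameter = list(range(1, 100))

# feed the output of each function into the input of the next, like a shell pipe
final_result = reduce(lambda acc, f: f(acc), chain, init_parameter)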

I am also considering that the function sequence can be added to, reduced, or remixed, for example:

                     test02() | test03() | test01() | test04()
[init_parameter]  ---------------------------------------------------->  [final result]
                     test03() | test01() | test04() | test01()
[init_parameter]  ---------------------------------------------------->  [final result]

I also want those functions to carry an embedded capability indicator that can be used for smart parameter pre-checking. For example, if the input is not a type the function accepts, or the input exceeds its maximum processing capacity, the function can simply be skipped for this calculation stream instead of using try...except to catch those "logic" errors.

See the code below; it is untested and just describes my idea:

def test01(a=None, f_inspect=False, **b):
    my_capability = {
        "function_cluster":          "A01",

        "input_acceptable":          list,
        "input_element_acceptable":  int,
        "input_length_max":          100,

        "return_type":               list,
        "return_element":            int,
        "return_length_max":         100,
        }

    if f_inspect:
        return my_capability

    return [x + 1 for x in a]         # just a sample; may raise a runtime error (e.g. division by zero)


def test02(a=None, f_inspect=False, **b):
    ...   # similar to test01, with its own capability dict

def test03(a=None, f_inspect=False, **b):
    ...   # similar to test01

def test04(a=None, f_inspect=False, **b):
    ...   # similar to test01
#==========================================



# test whether two adjacent functions in a chain are compatible
def f_capability_compatible(current_, next_):
    return (next_["function_cluster"] == current_["function_cluster"]
            and next_["input_acceptable"] is current_["return_type"]
            and next_["input_element_acceptable"] is current_["return_element"]
            and next_["input_length_max"] >= current_["return_length_max"])

foos = [test01,test02,test03,test04]

from itertools import permutations
mypermutations = list(permutations(foos, 3))    # all ordered 3-function chains of testXX

init_parameter = [x for x in range(1,100)]
dummy_function_parameter = {
        "function_cluster":          "A01",
        "input_acceptable":          list,
        "input_element_acceptable":  int,
        "input_length_max":          100,
        "return_type":               list,
        "return_element":            int,
        "return_length_max":         100,
        }
chain_flag = [True for x in range(len(mypermutations))]
#[True, True, True, ..... True, True, True, True, True, True, True]

for x in range(len(mypermutations)):
    tmp_para = dummy_function_parameter
    for y in mypermutations[x]:
        test_result = f_capability_compatible(tmp_para, y(f_inspect=True))
        chain_flag[x] = test_result
        tmp_para = y(f_inspect=True)
        if not test_result:
            print("this can not be chained due to parameter incompatibility at position %s" % x)
            break

#==========================================
result_of_chain = [None] * len(mypermutations)
# to invoke:
for x in range(len(mypermutations)):
    if chain_flag[x]:
        try:
            # invoke the mypermutations[x] chain in one go
            tmp_return = init_parameter
            for f in mypermutations[x]:
                tmp_return = f(tmp_return)   # parameter of testXX(a)
            result_of_chain[x] = tmp_return  # store the successful chain output
        except Exception:
            result_of_chain[x] = "Error"
    else:
        result_of_chain[x] = "Incomp"

Here is my question: is it possible to make this function chaining and combination idea simpler?

=======================================================================================

Update: why I need to pre-check the parameter and return types:

On the Linux command line, we can use a command like this:

$ cat somefile | grep something | awk '{print $0}' | grep something > file

This works because the data stream between those commands can be thought of as the "text" type.
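
A minimal Python sketch of that same property: when every stage takes text and returns text, stages can be piped freely (the filter functions below are made up for illustration):

def grep_a(text):
    # keep only lines that contain the letter "a"
    return "\n".join(line for line in text.splitlines() if "a" in line)

def upper(text):
    return text.upper()

data = "apple\nberry\nbanana"
for stage in [grep_a, upper]:
    data = stage(data)
print(data)    # APPLE, then BANANA on the next line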

In Python, however, an unknown function can take and return a wide variety of types. If I want to invoke such functions, I have to know their definitions. For example:

>>> def str_in_asciiout_out(str):
    return ord(str)
>>> 
>>> str_in_asciiout_out("a")
97
>>> str_in_asciiout_out(100)
# oops, now I have to use try... except...

Traceback (most recent call last):
  File "<pyshell#3>", line 1, in <module>
    str_in_asciiout_out(100)
  File "<pyshell#0>", line 2, in str_in_asciiout_out
    return ord(str)
TypeError: ord() expected string of length 1, but int found

Try...except is the right and proper way to code.

But if I want to combine hundreds of functions like str_in_asciiout_out() and put them into an arbitrary sequence, what I care about is the best final result the sequence can deliver in a short time.

For example (just an example): suppose I have 1000 functions defined, and each function may need to run for one day to produce its output for a given input. I randomly pick 200 functions into a chain, and by bad luck str_in_asciiout_out(100) sits at the last position; I may not get the oops until 199 days have been wasted.

That's why I want to know whether a function can show its ability before the time-wasting invocation.

The code above is an ugly solution that I know of, so I am posting the idea here to see if there is a better solution to my problem.


Solution

  • I recently saw a slide presentation on Python generators that went through a lot of neat things you can do with generator functions, which allow you to use them like a pipe-and-filter system. They're also evaluated "lazily", so when you process, say, a very long list, only as much of that list as is needed gets consumed in order to give you the first output of the resulting generator.
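
    A minimal sketch of that generator pipeline style (not taken from the slides; the stage names here are made up):

    def source(data):
        # source stage: yield items one at a time
        for x in data:
            yield x

    def add_one(items):
        for x in items:
            yield x + 1

    def only_even(items):
        for x in items:
            if x % 2 == 0:
                yield x

    # compose the pipeline; nothing is computed until values are pulled from the end
    pipeline = only_even(add_one(source(range(1, 1000000))))
    print(next(pipeline))   # 2 -- only as much of the input as needed was consumed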

    It looks here like you're trying to shoehorn static typing into Python. While there are cases for static typing, I'm not sure it's a good idea to force it into a dynamic language at the application level in this manner. The kind of failure you're trying to prevent can largely be caught by testing your code on small inputs first.

    And lastly, if you're trying to annotate functions with metadata about their return types, it would be better to use a decorator on the function. For instance, if you were set on using types, you could use something like this example decorator from the Decorators PEP:

    def attrs(**kwds):
        def decorate(f):
            for k in kwds:
                setattr(f, k, kwds[k])
            return f
        return decorate
    
    @attrs(argument_types=(int, int,),
           returns=int)
    def add(a, b):
        return a + b
    

    Then, rather than calling the function with an f_inspect parameter, you could just access the argument_types and returns attributes of add.
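
    For example (a rough sketch building on the attrs decorator and add function above; show and chain_is_compatible are hypothetical names), you could check a whole chain for declared type compatibility before invoking anything:

    @attrs(argument_types=(int,),
           returns=str)
    def show(a):
        return str(a)

    def chain_is_compatible(funcs, initial_type):
        # simplistic check: the incoming type must be among the declared argument types
        current = initial_type
        for f in funcs:
            if current not in f.argument_types:
                return False
            current = f.returns
        return True

    print(chain_is_compatible([add, show], int))   # True:  int -> int -> str
    print(chain_is_compatible([show, add], int))   # False: show returns str, but add wants ints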