Note that I am relatively new to coding and have little to no real experience with it. I wrote what I assume is a simple multiplication function, but after running it there was nothing at all in the console. I wrote this in PyCharm, by the way. Thank you.
def multip(num2, num1):
    num1 = int(input("Give me a value for num1:"))
    num2 = int(input("Do the same for num2:"))
    print(num1 * num2)

multip(num2, num1)
I then rewrote it, but this time set the two variables to initial values of 0, making them global variables. Can someone explain why I had to do that for it to work? The new code is below:
num1 = 0
num2 = 0

def multip(num2, num1):
    num1 = int(input("Give me a value for num1:"))
    num2 = int(input("Do the same for num2:"))
    print(num1 * num2)

multip(num2, num1)
I ran it and got nothing until I initialized the variables to 0.
I think your code looks like this:
def multip(num2, num1):
    num1 = int(input("Give me a value for num1:"))
    num2 = int(input("Do the same for num2:"))
    print(num1 * num2)

multip(num2, num1)
There is a problem on the last line: you wrote multip(num2, num1), but neither num2 nor num1 has been defined at that point, so Python raises a NameError and the program stops before the function body ever runs.
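For example, if you run that call on its own in a fresh interpreter, Python should report something like:

multip(num2, num1)
# NameError: name 'num2' is not defined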
That's why initializing the two variables num1 and num2 beforehand makes the error go away and lets the program run: the call now has values to pass in, even though the function immediately overwrites both parameters with the results of input().
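Since the function asks for both values itself, the parameters are never actually used, so a cleaner fix is either to drop the parameters entirely or to read the values first and pass them in. Here is a minimal sketch of the second approach (the variable names a and b are just illustrative):

def multip(num1, num2):
    # multiply the two values that were passed in
    print(num1 * num2)

# read the inputs first, then hand them to the function
a = int(input("Give me a value for num1:"))
b = int(input("Do the same for num2:"))
multip(a, b)

This way the function works with whatever arguments it is given, and no global variables are needed.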