I've been studying Solidity and looking at similar projects already deployed on Ethereum mainnet via Etherscan. I'm trying to understand what values were passed to certain functions of a contract. When I look at a transaction's decoded input data I see something like this:
Function: someUintFunction1(uint256 maxTxAmount)
MethodID: 0xec28438a
[0]: 0000000000000000000000000000000000000000000000a2a15d09519be00000
Function: someUintFunction2(uint256 _minimumTokensBeforeSwap)
MethodID: 0xf0f165af
[0]: 000000000000000000000000000000000000000000000002b5e3af16b1880000
Function: someBoolFunction(bool _enabled)
MethodID: 0xc49b9a80
[0]: 0000000000000000000000000000000000000000000000000000000000000000
I guess 0000000000000000000000000000000000000000000000000000000000000000 as a bool is false?
But how can I decode 0000000000000000000000000000000000000000000000a2a15d09519be00000 and 000000000000000000000000000000000000000000000002b5e3af16b1880000 into readable values?
These are the values of the uint256 params, encoded as 32-byte big-endian hexadecimal (the most significant byte is on the left).
You can convert it to decimal manually, or use the web3 JS library (note that decodeParameter expects the hex string to carry a 0x prefix):
const Web3 = require('web3');
const web3 = new Web3();
// decodeParameter returns the decoded value as a decimal string
const decimal = web3.eth.abi.decodeParameter('uint256', '0x0000000000000000000000000000000000000000000000a2a15d09519be00000');
console.log(decimal); // prints 3000000000000000000000