Let's say we have 4 applications, each producing 100 data points, so 400 data points in total. But how would you find the sum of the data points over all 100*100*100*100 combinations?
The 100 data points for each application are stored in an array of type [a][b], where a and b are both 10. So a given data point for an application is [app][a][b], and the summation for one combination seems pretty simple: [app0][a][b] + [app1][a][b] + ...
However, this is where I'm stuck: I don't know how to get the total over all the combinations. How would one calculate the sum over all 100*100*100*100 combinations of data in C? I'm missing some math here; if you can help me, it'd be great.
EDIT:
app0 [[17, 24, 85, 43, 4], [92, 6, 17, 62, 20], [72, 100, 59, 84, 67]]
app1 [[83, 8, 95, 74, 61], [95, 84, 15, 70, 89], [6, 91, 13, 85, 43]]
app2 [[88, 98, 86, 52, 32], [37, 1, 96, 43, 72], [10, 62, 76, 100, 35]]
Possible combinations are
17+83+88
17+83+98
....
17+83+37
...
I am not sure whether I've got this correctly, so correct me if I'm wrong: you have X = 4 tables, let's say; each table has Y = 10 rows, and each row has Z = 10 columns. You take one element from each table and add them together, so each sum combines X = 4 elements, one per table. You then want to add all of those sums together into one greater/ultimate sum. Am I right?
If so, in the end you'll have (Y * Z) ^ X = (10 * 10) ^ 4 = 100 000 000 individual sums, which you are calling combinations in your question, when they are actually the sums of each combination. So what you're after is the sum of the sums of every combination, is that it?
Well then, here's what I'm thinking that may help you out:
a[tableindex][rowindex][columnindex]
shall denote the number in a given cell; indexes are zero-based. While generating the sum of each combination, things will look as follows:
a[0][row0][column0] + a[1][row1][column1] + a[2][row2][column2] + a[3][row3][column3]
The table indexes remain constant, while each row index ranges from 0 to Y - 1 = 9 and each column index from 0 to Z - 1 = 9. Now I want to ask you this: how many times will we encounter a[0][0][0] as a term?
row0 and column0 will have to stay 0, constant. row1, row2 and row3 can each range from 0 to 9, which is Y = 10 possible values. column1, column2 and column3 can each range from 0 to 9, which is Z = 10 possible values. Possibilities get multiplied in maths, so 1 * 1 * (10 * 10 * 10) * (10 * 10 * 10) = 1 000 000
is the answer. A general formula for that can be written as (Y * Z) ^ (X - 1): (Y * Z) because that is the number of cells in a table, raised to the power (X - 1) because that many tables remain once we exclude the one we've fixed.
Since each table has the same number of cells, and since this argument holds for every single term, you can simply add every cell together once, then multiply the result by (Y * Z) ^ (X - 1), which is a million in your case. The following code would do that, provided your numbers are small enough not to cause an overflow:
#include <stdio.h>

#define tablecount 4
#define rowcount 10
#define columncount 10

int main(void)
{
    int a[tablecount][rowcount][columncount] = { 0 };
    // Assuming it gets filled somewhere in between

    long long thenumber = 0; // long long to give the product more headroom

    // Add every cell together once.
    for (int table = 0; table < tablecount; table++)
        for (int row = 0; row < rowcount; row++)
            for (int column = 0; column < columncount; column++)
                thenumber += a[table][row][column];

    // Multiply by (Y * Z) ^ (X - 1).
    for (int i = 1; i < tablecount; i++) // notice that the initial value for i is 1
        thenumber *= rowcount * columncount;

    printf("%lld\n", thenumber);
    return 0;
}
Make sure to fill the array in-between, and also to include appropriate libraries whenever needed.