I have created a UDF which works pretty well as it is. However, if the second range is not passed (it is optional), it gets set to a range inside the UDF.
If I then try to return a single cell from that range directly, I need to add .Cells to the range in order to select the cell with (x, y).
If no second range is passed, the UDF does something like this:
Set optional_range = required_range.Columns(2)
Set required_range = required_range.Columns(1)
If, later in the UDF, I want to output a cell from optional_range, I get this behavior:
Set MyFunction = optional_range(x, y) 'cell shows #VALUE
Set MyFunction = optional_range.Cells(x, y) 'shows correct value
But as said: if optional_range is assigned a range directly from the formula, it shows the correct value even without .Cells.
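For illustration, this is a stripped-down sketch of what the UDF roughly looks like (names and the index calculation are simplified placeholders; the real code is linked below):

Public Function MyFunction(required_range As Range, Optional optional_range As Range) As Variant
    Dim x As Long, y As Long

    'If only one range is passed, split it into the two working ranges
    If optional_range Is Nothing Then
        Set optional_range = required_range.Columns(2)
        Set required_range = required_range.Columns(1)
    End If

    'Placeholder; the real UDF calculates x and y from the inputs
    x = 1: y = 1

    'Set MyFunction = optional_range(x, y)      'shows #VALUE when the ranges were split above
    Set MyFunction = optional_range.Cells(x, y) 'shows the correct value
End Function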
I can't find a reason for this behavior at all. Can someone tell me why this happens?
The full code can be found here.
Your error occurs because you set optional_range to the Range.Columns property and then attempt to piece it out into individual Range.Cells objects. The Range.Columns and Range.Rows properties share many methods with a Range object, but this is not one of them.
The solution is simply to set optional_range explicitly to the .Cells of .Columns(2).
Set optional_range = required_range.Columns(2).Cells
'optional truncation to the worksheet's .UsedRange
Set optional_range = Intersect(required_range.Parent.UsedRange, required_range.Columns(2))
I have also added a way to truncate .Columns(2) down to the Worksheet.UsedRange property. Think of it as SUMIF vs. SUMPRODUCT with full-column references; this is particularly helpful if you plan to loop through the cells in optional_range. There is no need to add .Cells to this; the Intersect method returns the cells in the intersection as a true Range object.
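For instance, a rough sketch of how the truncated range could then be looped over (variable names are just examples):

Dim cll As Range
Set optional_range = Intersect(required_range.Parent.UsedRange, required_range.Columns(2))

If Not optional_range Is Nothing Then   'Intersect returns Nothing if the ranges do not overlap
    For Each cll In optional_range
        'each cll is a true single-cell Range; no .Cells needed
        Debug.Print cll.Address(0, 0), cll.Value
    Next cll
End If

The Nothing check is worth keeping, since Intersect returns Nothing rather than an empty Range when there is no overlap.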