I am trying to find an efficient way of simulating random meetings in a graph in R using igraph.
I managed to do it using the code below, in which I assume edges appear with some probability (prob.meet) and add them to a (pre-existing) empty graph of the same size.
However, for large graphs this is not efficient, and I repeat this process over and over. Profiling shows that most of the time goes into two calls: rbinom() (about 25% of the time) and add_edges() (about 75% of the time); a sketch of how this can be measured follows the code below. Any suggestion on how to improve efficiency?
Here is what I tried:
library(igraph)
nodes = 5
g1 <- barabasi.game(nodes)
EL1 <- get.edgelist(g1, names=FALSE)
# g_meet is built to conform in size with g1
prob.meet = 0.5
EL_meet <- matrix(EL1[(as.logical(rbinom(nrow(EL1), 1, prob.meet))), ],
                  nrow = 2, byrow = TRUE)
g_meet <- make_empty_graph(n = nodes) %>%
  add_edges(EL_meet)
You can use delete_edges like below:

g1 %>%
  delete_edges(which(runif(ecount(.)) > prob.meet))

where

- runif(ecount(.)) > prob.meet yields a random logical vector indicating the removals,
- ecount returns the number of edges in the graph g1,
- which returns the ids of the edges that should be removed.

Benchmarking the three approaches (f_OP is the question's code, f_Tim uses an edge attribute, f_TIC is the delete_edges approach above):

f_OP <- function() {
  EL1 <- get.edgelist(g1, names = FALSE)
  EL_meet <- matrix(EL1[(as.logical(rbinom(nrow(EL1), 1, prob.meet))), ],
                    nrow = 2, byrow = TRUE)
  make_empty_graph(n = nodes) %>%
    add_edges(EL_meet)
}

f_Tim <- function() {
  delete_non_meeting_edges <- function(g, prob.meet) {
    g <- set_edge_attr(g, "meet", E(g), runif(gsize(g)) < prob.meet)
    delete_edges(g, E(g)[!meet])
  }
  delete_non_meeting_edges(g1, prob.meet)
}

f_TIC <- function() {
  g1 %>%
    delete_edges(which(runif(ecount(.)) > prob.meet))
}
nodes <- 100000
g1 <- barabasi.game(nodes)
prob.meet <- 0.5
library(microbenchmark)
microbenchmark(
  f_OP(),
  f_Tim(),
  f_TIC(),
  unit = "relative"
)
and you will see
Unit: relative
expr min lq mean median uq max neval
f_OP() 1.584501 1.583768 1.631061 1.618017 1.675887 1.535542 100
f_Tim() 1.517888 1.520832 1.597230 1.584570 1.679498 1.585434 100
f_TIC() 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 100
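As a follow-up to the benchmark, and since the question mentions repeating the draw over and over, the fastest variant can be wrapped in a small helper and called repeatedly. This is only a sketch; the name simulate_meeting and the replicate() call are illustrative, not part of the answer above.

# sketch: reuse the delete_edges() approach for repeated draws
simulate_meeting <- function(g, prob.meet) {
  delete_edges(g, which(runif(ecount(g)) > prob.meet))
}

# e.g. 100 independent "meeting" graphs drawn from the same g1
meetings <- replicate(100, simulate_meeting(g1, prob.meet), simplify = FALSE)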