I have trouble understanding how class variables in Torch work.
I did the following:
mydata=torch.class('something')
I checked the user variables by typing who(), which shows:
== User Variables ==
[_RESULT] = table - size: 0
[mydata] = table - size: 0
[something] = table - size: 0
First of all I tried to delete mydata with
mydata=nil
This works: mydata is now freed and can be reassigned any value. But when I tried to delete the variable something by typing
something=nil
it does not seem to work, even though the variable something is no longer listed in who(). When I try:
mydata2=torch.class('something')
the following error pops up:
/data/torch/install/share/lua/5.1/torch/init.lua:65: something has been already assigned a factory
stack traceback:
[C]: in function 'newmetatable'
/data/torch/install/share/lua/5.1/torch/init.lua:65: in function 'class'
[string "mydata2=torch.class('something')"]:1: in main chunk
[C]: in function 'xpcall'
/data/torch/install/share/lua/5.1/trepl/init.lua:648: in function 'repl'
/data/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:185: in main chunk
[C]: at 0x00406670
Could anyone tell me the reason behind it?
torch.class() stores the class metatable in the Lua registry; see http://www.lua.org/pil/27.3.1.html and the luaT_lua_newmetatable() function in the Torch C backend.
In order to unregister an existing class, you have to remove that entry from the Lua registry. You can access the registry from Lua with the debug.getregistry() function.
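For instance, after the mydata = nil and something = nil assignments from your question, the metatable is still registered under the class name, which is why the second torch.class call fails. A quick way to check this (assuming the standard th REPL):
-- the globals are gone, but the registry entry is still there
print(debug.getregistry()['something'] ~= nil)   -- prints true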
After removing the entry from the registry, your example works:
mydata = torch.class('something')
mydata = nil
something = nil   -- clearing the globals alone is not enough
-- remove the global registration:
debug.getregistry()['something'] = nil
-- now it is possible to register the class again
mydata2 = torch.class('something')
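For completeness, here is a minimal sketch of using the re-registered class in the usual torch.class way; the __init method, the value field and the obj name are just placeholders for this example:
-- methods are defined on the table returned by torch.class
function mydata2:__init(value)
   self.value = value
end

-- instances are created by calling the (re-created) global class name
obj = something(42)
print(obj.value)   -- 42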