I've been using Vuforia for a while now, and it has the limitation that I can't directly submit an image to the Natural Feature Tracking (NFT) processor for conversion into a trackable data file. Instead it is hard-wired to take the image directly from the camera, which gives me no control. See, for example, the UserDefinedTargets demo.
Does ARToolKit allow me to submit a JPEG to the NFT processor directly from my mobile device for processing? I want to achieve something like UserDefinedTargets in Vuforia, but with the ability to submit my own natural-feature images as JPEGs on the device itself. I could then save images taken on the fly for later processing, or better still, save the processed NFT data for future use. I do not want to use a cloud service: there is a workaround with Vuforia, but it requires their cloud service, which has its own limitations!
According to the documentation here: http://artoolkit.org/documentation/doku.php?id=3_Marker_Training:marker_nft_training there is a program that can be used to do the feature extraction. It works on a digital image, so, without having looked into the code, I foresee two options for you:
a) Check out the source code and see if you can get that tool running on an Android phone, most likely via the NDK.
b) Build a web service that receives an image, runs this program, and returns the result, so you can call it like a normal REST API.
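To make option (b) concrete, here is a minimal sketch of such a service using only the Python standard library. It is not a production implementation: the tool name (`genTexData`) and the output file extensions (`.iset`/`.fset`/`.fset3`) are assumptions based on the linked training documentation, so adjust them to match whatever your ARToolKit build actually ships.

```python
# Sketch of option (b): an HTTP endpoint that accepts a POSTed JPEG,
# runs ARToolKit's NFT training tool on it, and returns the generated
# feature-set files as a zip the phone can cache for future use.
# ASSUMPTIONS: the CLI tool is called "genTexData" and writes
# target.iset/.fset/.fset3 next to the input image (per the linked docs).
import os
import subprocess
import tempfile
import zipfile
from http.server import BaseHTTPRequestHandler, HTTPServer

class NFTHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Save the uploaded JPEG into a scratch directory.
        workdir = tempfile.mkdtemp()
        image_path = os.path.join(workdir, "target.jpg")
        length = int(self.headers["Content-Length"])
        with open(image_path, "wb") as f:
            f.write(self.rfile.read(length))

        # Run the (assumed) training tool; it writes its output files
        # alongside the input image.
        subprocess.run(["genTexData", image_path], check=True, cwd=workdir)

        # Bundle the generated trackable data so the client can store it.
        zip_path = os.path.join(workdir, "target_nft.zip")
        with zipfile.ZipFile(zip_path, "w") as zf:
            for ext in (".iset", ".fset", ".fset3"):
                zf.write(os.path.join(workdir, "target" + ext), "target" + ext)

        with open(zip_path, "rb") as f:
            payload = f.read()
        self.send_response(200)
        self.send_header("Content-Type", "application/zip")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    HTTPServer(("", 8080), NFTHandler).serve_forever()
```

The nice side effect of this approach is that the server caches nothing Vuforia-specific: the phone just uploads a JPEG and gets back the processed NFT data, which it can save locally and reuse without touching any third-party cloud.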
Hope this helps.