I have scrapyd and my spider installed on a Unix machine, and everything works fine when I run
curl http://localhost:6800/schedule.json -d project=myproject -d spider=somespider
I can see the job status, logs, and items on the scrapyd web interface; in short, everything works as expected.
Now I want to start a spider programmatically by making an HTTP POST to the API from ASP.NET using C#, since scrapyd will be part of my .NET project, but I get
{"status": "error", "message": "'project'"}
I found an example at http://mahmoud.abdel-fattah.net/2012/07/04/super-simple-and-basic-scrapyd-web-interface/comment-page-1/ that makes a jQuery post, and that works for me, but the code below does not:
public void StartCrawler()
{
    var httpWebRequest = (HttpWebRequest)WebRequest.Create("http://mydomain.com:6800/schedule.json");
    httpWebRequest.ContentType = "application/json; charset=utf-8";
    //httpWebRequest.ContentType = "text/json; charset=utf-8";
    httpWebRequest.Method = "POST";

    using (var streamWriter = new StreamWriter(httpWebRequest.GetRequestStream()))
    {
        string json = "{\"project\":\"projectname\",\"spider\":\"spidername\"}";
        streamWriter.Write(json);
    }

    var httpResponse = (HttpWebResponse)httpWebRequest.GetResponse();
    using (var streamReader = new StreamReader(httpResponse.GetResponseStream()))
    {
        var responseText = streamReader.ReadToEnd();
    }
}
Please tell me what I am doing wrong.
I have solved it by posting the parameters as form-urlencoded data instead of a JSON body (schedule.json expects form parameters, the same way curl -d sends them):
public static string StartCrawling(string URI, string Parameters)
{
    WebRequest req = WebRequest.Create(URI);
    req.ContentType = "application/x-www-form-urlencoded";
    req.Method = "POST";

    byte[] bytes = System.Text.Encoding.ASCII.GetBytes(Parameters);
    req.ContentLength = bytes.Length;

    using (Stream os = req.GetRequestStream())
    {
        os.Write(bytes, 0, bytes.Length); // push the form-encoded body out
    }

    using (WebResponse resp = req.GetResponse())
    {
        using (StreamReader sr = new System.IO.StreamReader(resp.GetResponseStream()))
        {
            return sr.ReadToEnd().Trim();
        }
    }
}
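For reference, this is roughly how I call it. This is just a minimal usage sketch, assuming scrapyd is reachable on localhost:6800 and reusing the project and spider names from the curl example above:

// Minimal usage sketch (assumed host and names, adjust to your setup)
string response = StartCrawling(
    "http://localhost:6800/schedule.json",
    "project=myproject&spider=somespider");
// On success, scrapyd returns a JSON status such as {"status": "ok", "jobid": "..."}
Console.WriteLine(response);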