I am trying to get the HTML of a URL on my own site. The site is built with Laravel 5.4.
I want the HTML content of an endpoint so I can store it in the database as plain text,
but for some reason I keep getting continuous loading, even though I'm running on localhost.
This is what I've tried, based on some questions I've seen here on Stack Overflow (e.g. Can't seem to get a web page's contents via cURL - user agent and HTTP headers both set?):
$url = url("template/1/11"); // http://localhost:8000/template/1/11
$html = file_get_contents($url);
AND
$url = url("template/1/11"); // http://localhost:8000/template/1/11
$c = curl_init($url);
curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
//curl_setopt(... other options you want...)
$html = curl_exec($c);
if (curl_error($c)) {
    die(curl_error($c));
}
// Get the status code
$status = curl_getinfo($c, CURLINFO_HTTP_CODE);
curl_close($c);
Is there a reason I'm getting this behavior? Please help, I just need the HTML content of the URL.
You have to specify the file's extension in your URL. If your file is named index.html and is inside a folder called template, it's incorrect to use the URL template/index; instead, you have to change it to template/index.html.
EDIT:
I had misread a part of the question. Apparently file_get_contents()
doesn't work on Laravel routes; I suggest you refer to this answer.
From the link:
$html = View::make($url)->render();
You might have to adjust the URL if it's not pointing to the right route.
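For example, here is a minimal sketch of that approach, assuming the route behind template/1/11 renders a Blade view named template with a couple of variables. The view name, the variable names, and the Snapshot model are only assumptions for illustration; adjust them to match your own controller and schema:

use Illuminate\Support\Facades\View;

// Render the Blade view directly to a string instead of requesting
// your own URL over HTTP (assumed view name and variables).
$html = View::make('template', ['group' => 1, 'id' => 11])->render();

// Store the rendered markup as plain text in the database
// (Snapshot is a hypothetical Eloquent model used only for illustration).
Snapshot::create(['content' => $html]);

Rendering the view server-side also avoids making an HTTP request back to the same php artisan serve process, which handles one request at a time and is a likely cause of the endless loading you see on localhost.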