python-3.x, web-crawler, google-crawlers

Why does my Python crawler cause an HTTP 405 error?


I am not good at English, so please bear with my awkward phrasing.

I tried using HTTP POST to crawl Google, but a problem has arisen.

The problem is that an HTTP 405 error occurs in the output page.

This is the Python 3.5.1 source:

import requests
from bs4 import BeautifulSoup

def image_upload():
    # Upload a local image to Google's search-by-image endpoint and
    # return the href of the first <a> tag found in the response body.
    filePath = 'C:/test.jpg'
    searchUrl = 'http://www.google.com/searchbyimage/upload'
    multipart = {'encoded_image': (filePath, open(filePath, 'rb')), 'image_content': ''}
    response = requests.post(searchUrl, files=multipart, allow_redirects=False)
    plain_text = response.text
    soup = BeautifulSoup(plain_text, "html.parser")
    for link in soup.find_all('a'):
        return link.get('href')

def Crawling(target_link):
    # Request the link returned above and print the parsed page.
    # This is the request that comes back with HTTP 405.
    response = requests.post(target_link)
    html_content = response.text.encode(response.encoding)
    soup = BeautifulSoup(html_content, "html.parser")
    edutData = soup.find_all('a', {'class': 'bili uh_r rg_el uvg-i'})
    print(soup)

image_link = image_upload()
print(image_link)
Crawling(image_link)

Why does the HTTP 405 error happen in the output page?


Solution

  • HTTP response 405 is returned when the request method you are using is not supported by the endpoint. In your case, the POST method in Crawling() might not be supported by the URL you are accessing, so that endpoint likely expects a GET request instead (see the sketch below).

    For reference: HTTP response 405 (Method Not Allowed).
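
    A minimal sketch of that change, assuming the link returned by image_upload() accepts GET requests; the User-Agent header and its value are assumptions added because Google often rejects requests that do not look like a browser:

    import requests
    from bs4 import BeautifulSoup

    def Crawling(target_link):
        # Assumption: the results page only needs to be read, so fetch it with GET
        # instead of POST, which is what triggers the 405.
        # The User-Agent value is an assumption; Google may still block
        # requests that look automated.
        headers = {'User-Agent': 'Mozilla/5.0'}
        response = requests.get(target_link, headers=headers)
        print(response.status_code)   # expect 200 rather than 405
        soup = BeautifulSoup(response.text, "html.parser")
        print(soup.title)

    If the request still fails, printing response.headers.get('Allow') on the failing response can help, since servers may include an Allow header on a 405 listing the methods they actually permit.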