Import scraped data from a website directly into PostgreSQL

I want to import web-scraped data directly into PostgreSQL, without exporting it to a .csv file first.
Here is the code I currently use to export the data to a CSV file, which I then import manually. Any help would be appreciated.
from urllib.request import urlopen as uReq
from bs4 import BeautifulSoup as soup
my_url = 'http://tis.nhai.gov.in/TollInformation?TollPlazaID=236'
uClient = uReq(my_url)
page1_html = uClient.read()
uClient.close()
#html parsing
page1_soup = soup(page1_html,"html.parser")
filename = "TollDetail12.csv"
f = open(filename,"w")
headers = "ID, tollname, location, highwayNumber\n"
f.write(headers)
#grabbing data
containers = page1_soup.findAll("div",{"class":"PA15"})
for container in containers:
    toll_name = container.p.b.text
    search1 = container.findAll('b')
    highway_number = search1[1].text
    location = list(container.p.descendants)[10]
    ID = my_url[my_url.find("?"):]
    mystr = ID.strip("?")
    print("ID: " + mystr)
    print("toll_name: " + toll_name)
    print("location: " + location)
    print("highway_number: " + highway_number)
    f.write(mystr + "," + toll_name + "," + location + "," + highway_number.replace(",","|") + "\n")
f.close()
Read [insert data into PostgreSQL](http://www.postgresqltutorial.com/postgresql-python/insert/). It will help you solve your problem. –
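Following that comment, here is a minimal sketch of how the scraping loop could write rows straight into PostgreSQL instead of a CSV file. It assumes the psycopg2 driver is installed, that the connection parameters (dbname, user, password, host) are replaced with your own, and that a table named toll_details already exists; these names are placeholders, not part of the original code.

from urllib.request import urlopen as uReq
from bs4 import BeautifulSoup as soup
import psycopg2

my_url = 'http://tis.nhai.gov.in/TollInformation?TollPlazaID=236'
uClient = uReq(my_url)
page1_soup = soup(uClient.read(), "html.parser")
uClient.close()

# Placeholder connection settings -- replace with your own credentials.
conn = psycopg2.connect(dbname="mydb", user="postgres",
                        password="secret", host="localhost")
cur = conn.cursor()

# Assumed target table; create it once before running, e.g.:
# CREATE TABLE toll_details (id text, toll_name text, location text, highway_number text);
insert_sql = """
    INSERT INTO toll_details (id, toll_name, location, highway_number)
    VALUES (%s, %s, %s, %s)
"""

containers = page1_soup.findAll("div", {"class": "PA15"})
for container in containers:
    toll_name = container.p.b.text
    highway_number = container.findAll('b')[1].text
    location = list(container.p.descendants)[10]
    toll_id = my_url[my_url.find("?") + 1:]
    # Parameterized insert: no manual escaping and no CSV intermediate step.
    cur.execute(insert_sql, (toll_id, toll_name, location, highway_number))

conn.commit()
cur.close()
conn.close()

With this approach the CSV file and the manual import step disappear entirely; the single commit at the end writes all scraped rows in one transaction.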