Posts Tagged: "web scraping python"

Grab site price and write to a Google spreadsheet using Python

Applications, Python, Tutorials

By the end of this read you will be able to grab a site's price from siteprice.org and write it to a Google spreadsheet using Python. Every website has its competition. As our website evolves, we gain more competitors, and those competitors' websites also earn good value. It is vital to know the value of our website as well as the value of our competitors'. Siteprice.org is one of those websites which calculates a website's worth based on different factors.

Our strategy for querying the price of a number of websites will be to put their domain names in a text file, one domain per line. You may wish to put hundreds of websites in this text file, one for each of your competitors.
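For example, a websitesforprice.txt listing three competitors might look like this (the domains here are placeholders):

www.example.com
www.example.org
www.example.net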

Python code to extract the site price and write it to a Google spreadsheet

from bs4 import BeautifulSoup
from urllib2 import urlopen
import gdata.spreadsheet.service
import datetime
rowdict = {}
rowdict['date'] = str(datetime.date.today())
spread_sheet_id = '13mX6ALRRtGlfCzyDNCqY-G_AqYV4TpE7rq1ZNNOcD_Q'
worksheet_id = 'od6'
client = gdata.spreadsheet.service.SpreadsheetsService()
client.debug = True
client.email = 'email@domain.com'
client.password = 'password'
client.source = 'siteprice'
client.ProgrammaticLogin()
with open('websitesforprice.txt') as f:
    for line in f:
        soup = BeautifulSoup(urlopen("http://www.siteprice.org/website-worth/" + line.strip()).read(), "html.parser")
        rowdict['website'] = line.strip()
        rowdict['price'] = soup.find(id="lblSitePrice").string
        client.InsertRow(rowdict, spread_sheet_id, worksheet_id)

1. Lines 1 to 4

These lines are import statements. In this program we use various Python libraries. gdata is used to access the Google spreadsheet. We use BeautifulSoup because it lets us grab data via an element's id, which is how we will get the price of a website. datetime is used to get the current date. urlopen is used to open the web page which contains the data we want.

2. Lines 5 to 14

In order to write the extracted price to a Google spreadsheet programmatically, we use the gdata module. To write to a spreadsheet we need the spreadsheet id, the worksheet id and a dictionary containing the values we want to write. In the dictionary, each key is a column header and each value is the string to be written to that column (website, price and date for our program).
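Concretely, one row as pushed by our program could look like this (the price and date values are purely illustrative):

rowdict = {
    'website': 'www.example.com',
    'price': '$1,234',
    'date': '2015-06-01',
}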

Go to docs.google.com while logged in and create a new spreadsheet. Fill the first three columns of the first row with website, price and date respectively. All the letters should be lower case, with no whitespace. Now that you have created a new spreadsheet, take a look at the URL. It looks something like this one:

https://docs.google.com/spreadsheets/d/13mX6ALRRtGlfCzyDNCqY-G_AqYV4TpE7rq1ZNNOcD_Q/edit#gid=0

The spreadsheet id (mentioned earlier) is present in the URL.

"13mX6ALRRtGlfCzyDNCqY-G_AqYV4TpE7rq1ZNNOcD_Q" in the above URL is the spreadsheet id we need. By default the worksheet id is 'od6'.
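If you ever want to pull the id out of the URL programmatically, a minimal sketch like the following works, assuming the /spreadsheets/d/<id>/ URL layout shown above:

url = "https://docs.google.com/spreadsheets/d/13mX6ALRRtGlfCzyDNCqY-G_AqYV4TpE7rq1ZNNOcD_Q/edit#gid=0"
spread_sheet_id = url.split("/d/")[1].split("/")[0]
print(spread_sheet_id)  # 13mX6ALRRtGlfCzyDNCqY-G_AqYV4TpE7rq1ZNNOcD_Q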

Basically, lines 5 to 14 are the code to access the Google spreadsheet.

3. Lines 15 to 20

Since we're writing a program that can extract prices for hundreds of websites and append them to a Google spreadsheet, taking URLs from console input is never a good solution. Instead we write the websites we care about into a text file, each on its own line in the format www.domain.com. Make sure there is one valid website per line, because we will read the file from Python line by line.

Line 17 makes a soup element out of the URL which has the information we are looking for; the soup element corresponds to a different website in each iteration. Line 18 stores the domain in the "website" key of the dictionary rowdict. Line 19 stores the price of the website in the "price" key. You can see we use BeautifulSoup to get data via id. Finally, line 20 pushes the entire row to the Google spreadsheet. This block runs once for each line in the text file.
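As a side note, urllib2 only exists on Python 2. If you are on Python 3, a rough equivalent of the scraping part (an untested sketch, assuming siteprice.org still serves the lblSitePrice element) would be:

from bs4 import BeautifulSoup
from urllib.request import urlopen  # Python 3 replacement for urllib2

with open('websitesforprice.txt') as f:
    for line in f:
        domain = line.strip()
        page = urlopen("http://www.siteprice.org/website-worth/" + domain).read()
        soup = BeautifulSoup(page, "html.parser")
        print(domain, soup.find(id="lblSitePrice").string)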

Thanks for reading :) Enjoy! If you have any questions regarding the post, feel free to comment below.

What are the most interesting web scraping modules for Python?

Python, Tutorials, Web
The Python programming language has been in the hype for over a decade. It is the most recommended language for beginner programmers, since its syntax is readable by almost every non-programmer too, and it is equally recommended for web scraping, automation and data science. Python does come up short in terms of speed when compared to languages such as C++ and Java, but its big plus is the wide range of enthusiastic contributors and users around the globe. There are countless modules for various domain-specific tasks, which makes it even more popular today. From web scraping to GUI automation, there are modules for almost everything. Here, in this post, I will list some of the most used and interesting Python modules for web scraping that are lifesavers for a programmer.

Popular python modules for web scraping

1. Mechanize

mechanize is a popular Python module because it allows the creation of a browser instance. It also maintains sessions, which makes it a handy toolkit for tasks like login and signup automation.
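Here is a minimal sketch of that workflow; the login URL and form field names are made up for illustration:

import mechanize

br = mechanize.Browser()
br.set_handle_robots(False)          # ignore robots.txt (use responsibly)
br.open("http://example.com/login")  # hypothetical login page
br.select_form(nr=0)                 # select the first form on the page
br["username"] = "myuser"            # hypothetical form field names
br["password"] = "mypass"
response = br.submit()               # the session cookie is kept for later requests
print(response.geturl())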

2. BeautifulSoup

BeautifulSoup is another beautiful Python module which helps scrape the data we need from HTML/XML via tags. With BeautifulSoup you can scrape almost anything, because it provides different methods like searching via tags, finding all links, and so on.
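For instance, grabbing every link on a page takes only a couple of lines (a sketch; on Python 3 the import comes from urllib.request instead of urllib2):

from bs4 import BeautifulSoup
from urllib2 import urlopen

soup = BeautifulSoup(urlopen("http://www.siteprice.org").read(), "html.parser")
for a in soup.find_all("a"):  # every link on the page
    print(a.get("href"))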

3. Selenium

Although Selenium is a well-known module for automated browser testing, it can be used as a web scraping tool as well, and I promise you it pays off pretty well. With methods to find elements via id, name, class, etc., Selenium lets you grab anything on a website.
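A minimal sketch follows; it uses the old find_element_by_id style (Selenium 4 renamed these to find_element(By.ID, ...)) and assumes a local browser driver is set up:

from selenium import webdriver

driver = webdriver.Firefox()  # assumes Firefox and its driver are installed
driver.get("http://www.siteprice.org/website-worth/example.com")
price = driver.find_element_by_id("lblSitePrice").text  # same element as in the first post
print(price)
driver.quit()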

4. lxml

lxml is another wonderful library for parsing XML/HTML, although I would say BeautifulSoup beats it in terms of usability. You can opt for either lxml or BeautifulSoup, since they pretty much do the same job.
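The same id lookup from the first post, written as an XPath query with lxml (again just a sketch; use urllib.request on Python 3):

from lxml import html
from urllib2 import urlopen

tree = html.fromstring(urlopen("http://www.siteprice.org/website-worth/example.com").read())
price = tree.xpath('//*[@id="lblSitePrice"]/text()')  # xpath() returns a list of matches
print(price)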

I have used all of the above modules extensively in my projects and they have allowed me to move faster. I was able to do some cool things with them, for example: automating a conversation between two Cleverbots (AI-featured bots), getting paid courses on Udemy, and finding the most popular Facebook fan page among my friends. I totally recommend them. Below is a link to some of the most interesting things I've done with these modules.

Cool stuff with Python

Tell me how you felt about the article in the comments section below, and feel free to add some of your favorite modules too. As always, thanks for reading!