Finding The Facebook Fan Pages Of Startups Using Selenium And Facepy

As a regular activity of the Software Club at my college, we hold weekly meetups where we discuss various ideas and code whatever is possible within an hour or so. We split into groups of 3 or 4, and each group works on a different idea. At this point I really feel the tech giants (Google, FB, etc.) should also consider colleges in Nepal and similar countries for their internship programs. Yesterday (Jan 3), we discussed several ideas and my group worked on something cool too. However, I will only talk about my portion.

In a nutshell, I extracted all the startups in Nepal and found their Facebook pages. The data was then used by other members of my group to do something cool which I can’t discuss here.

Extract startups in Nepal and find FB page

from selenium import webdriver
from facepy import GraphAPI
import json
import time 

startup_fan_pages = {}


access_token = "access_token"   # get it here https://developers.facebook.com/tools/explorer/

graph = GraphAPI(access_token)

browser = webdriver.Firefox()
browser.get("http://startupsinnepal.com")

time.sleep(40) #wait for the browser to completely load the page


startups = browser.find_elements_by_class_name("panel-title") #returns a list of objects having class="panel-title"
print("startups found")

for startup in startups:
    # Search the Graph API for a page matching the startup's name
    r = graph.search(startup.text.lower(), "page", page=False, retry=3)  # page=False returns a dict rather than a generator
    if len(r['data']) > 0:
        startup_fan_pages[r['data'][0]['name']] = str(r['data'][0]['id'])

#print(startup_fan_pages)
with open('startupsinnepalfanpages.json', 'w') as fp:
    json.dump(startup_fan_pages, fp)

Startupsinnepal.com is a listing of all the startups in Nepal. I used Selenium to extract all the startup names from the website. To find the corresponding Facebook fan page for each of them, I made use of facepy, which provides an easy and quick way to make queries to the Graph API. All you need is an access token, which you can get from https://developers.facebook.com/tools/explorer/

In the real implementation, the data is stored in a Google spreadsheet so it is available to the other portion of the program for further computation. If you are interested in how to push data to a spreadsheet via Python, go ahead and read Grab alexa rank and write to google spreadsheet using python. Keep the comments coming, and please don’t use ad blockers (AdSense is this site’s only source of income); it keeps me motivated to publish good content.
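The defensive lookup inside the loop above can be factored into a small helper. This is just a sketch with hypothetical sample data shaped like a Graph API search response; first_page is my own name, not part of facepy:

```python
def first_page(result):
    """Return (name, id) of the first page in a Graph API search result, or None."""
    data = result.get('data', [])
    if not data:
        return None
    return data[0]['name'], str(data[0]['id'])

# Hypothetical sample shaped like a Graph API search response
sample = {'data': [{'name': 'Some Startup', 'id': 123456}]}
print(first_page(sample))        # ('Some Startup', '123456')
print(first_page({'data': []}))  # None
```

Keeping the empty-result check in one place makes it easier to later swap in a stricter matching rule (say, comparing page names to the startup name) without touching the scraping loop.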

Sorting An Array Of JSON Objects By A Specific Key Using JavaScript

- - JavaScript, Tutorials, Web

Example of sorting an array containing json elements

The following function takes two parameters. The first expects an array of JSON objects; the second expects a string, i.e. the key in the JSON objects on whose basis the array is to be sorted. For this example we use “first_name” as the sort key. We use the .toUpperCase() method so that the sort is not affected by letter case.

We define a variable personal_informations which holds an array of JSON objects. We alert its value so that the effect of the sorting can be seen clearly. We store the value returned from sortByKey in a variable sorted_personal_informations. Alerting that value, we see that the array has been sorted by the key “first_name”.

function sortByKey(array, key) {
    return array.sort(function(a, b) {
        var x = a[key].toUpperCase();
        var y = b[key].toUpperCase();
        return ((x < y) ? -1 : ((x > y) ? 1 : 0));
    });
}
var personal_informations = [
    {"first_name":"Bhishan","last_name":"Bhandari","email":"bbhishan@gmail.com","country":"Nepal","phone_number":"9849060230"},
    {"first_name":"Ankit","last_name":"Pradhan","email":"ankit@ankit.com","country":"Nepal","phone_number":"9999999999"},
    {"first_name":"Aalok","last_name":"Koirala","email":"aalok@aalok.com","country":"Nepal","phone_number":"8888888888"},
    {"first_name":"Subigya","last_name":"Nepal","email":"subigya@subigya.com","country":"USA","phone_number":"6666666666"}
];

alert(personal_informations);

var sorted_personal_informations = sortByKey(personal_informations, "first_name");

alert(sorted_personal_informations);
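For comparison, the same case-insensitive sort-by-key is a one-liner in Python with sorted() and a key function. A sketch with a trimmed-down version of the sample data:

```python
def sort_by_key(records, key):
    # Case-insensitive sort on the given key; returns a new list
    return sorted(records, key=lambda record: record[key].upper())

people = [
    {"first_name": "Bhishan"},
    {"first_name": "Ankit"},
    {"first_name": "Aalok"},
    {"first_name": "Subigya"},
]
print([p["first_name"] for p in sort_by_key(people, "first_name")])
# ['Aalok', 'Ankit', 'Bhishan', 'Subigya']
```

Note one difference from the JavaScript version: Array.prototype.sort sorts the array in place, while Python’s sorted() leaves the original list untouched and returns a new one.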

All You Need To Know To Build Applications With Phonegap

- - JavaScript, Phonegap
Warm greetings to my readers and a sincere apology for not being able to publish an article last week. The reason: the biggest app camp in Nepal, followed by college and an internship. However, these are just excuses. Getting into the details of this article: NCELL APP CAMP is about building a product for mobile devices that scales and solves a major problem of the country or the world. Teams compete in the following categories:

1. Health

2. Tourism

3. Games & Entertainment

4. Utilities

We (a group of 5 students) decided to make an application to disseminate information and save the lives of expectant mothers. Idea selection is now complete, with the top 150 ideas selected across the various categories. We made it through to the top 150 (38 selected in the Health category). We then needed to build an MVP for the second round. You may watch the demo of the application below.

We had around 2 months for development. We are students, and we procrastinate whenever possible. Suddenly, we had one week remaining until the MVP submission with nothing done yet. Saying one week isn’t fair either, once you subtract college time + internship time + assignments. Excuses again :p . Anyway, we had already divided the tasks. Two of my friends were to design the database and build the API. Two others were to write about the product, cost estimation, etc. (they required us to write a lot). I was to work on the mobile development. It was too late to produce a full-fledged application, so we opted to build a prototype of the final product with no API. The product we had was just a prototype with no back-end service. We didn’t make it into the top 26. Procrastination pays well, apparently.

I started with PhoneGap and jQuery Mobile. It’s easier to build multi-platform applications with PhoneGap and jQuery Mobile than it is to brew coffee. I mean it. Let me tell you, it’s mostly a bunch of:

$(".class-name").on('click', function(e){

});

$("#id-name").on('click', function(e){

});

$("#id-name").html("some html content");

function nameOfFunction(params){

}

$("#id-name").show();

$("#id-name").hide();

window.location.hash = "#id-name";

window.localStorage.setItem('key', 'value');

window.localStorage.getItem('key');

$.ajax({
    type: "GET", // or "POST"
    url: "http://someapi",
    data: data,
    success: function(data){

    },
    error: function(e){

    }
});

You can build a good working application with the above set of outlines alone. Most of the time, however, you need to utilize the API of the OS, for example to schedule a notification or send an SMS. Don’t be surprised when I tell you it’s even easier to add those features to your application than it is to go to a barber and get your hair trimmed.

All you need is a good plugin to talk to the OS’s API to perform things like sending an SMS from the application, scheduling a local notification, etc. To stretch the comparison: plugins are a set of barbers, each specializing in a different hair style. All you need to do is invoke a method they (the plugins) provide at the appropriate place, just as you would go to the barber who knows how to trim your hair in the specific style you want.

Below is the link to the GitHub repository which contains the code for the application prototype I built.

https://github.com/bhishan/ncellappcamp

I hope you enjoyed the article. Do comment below to say how you felt about it or to share your views. I will come up with a new article another week. Till then, happy coding! Goodbye!

Using Import.io To Extract Data Easily: Example Usage Of The Import.io API

- - JavaScript, Tutorials, Web
Hey guys, in this post I will explain one of the most reliable and easy-to-use extraction API providers, a tool named import.io.

What is import.io?

Import.io is a web-based data extraction platform which enables general users to access data from websites without writing any code. Import.io also serves an API for extracting the necessary data, which makes it very handy for programmers needing dynamic data from certain websites or data sources. Using import.io, data from websites can be queried easily, either manually or through the API. With the service, the whole web can be treated as one big database for machines and applications to read essential data from, something that would definitely be more cumbersome to do by hand.

There are numerous cases where a website does not provide an API for data extraction. In such cases, writing code to extract the right piece of information is possible but not very easy. Here import.io is a good fit, because it provides an API to extract the dataset more easily.

Example use of import.io’s API to extract data

1. Open up your browser and type in the following url to get to import.io. Sign up there; it’s free.

http://import.io

2. For this example, we are using a webpage of seasky.org. Below is the url where the dataset we want is present.

http://www.seasky.org/astronomy/astronomy-calendar-2015.html

3. Type the url into the input field as shown below and press the Get Data button. Below is the corresponding screenshot.

[Screenshot: entering the url and pressing Get Data]

4. Now we get the dataset we were looking for. Below is a screenshot of how it looks. At the bottom right corner is a Get API button. Press it to get the import.io API for extracting the dataset, then press “copy this to my data”.

[Screenshot: the extracted dataset and the Get API button]

5. You can now query the dataset in various formats, including JSON and TSV. Click on the Get API tab to see the structure of the url for querying the dataset.

[Screenshot: the Get API tab showing the query url]

6. At the end of the url is the parameter _apikey=YOUR API KEY. When using the url in your application, you need to replace YOUR API KEY with the alphanumeric API key provided by import.io. You can view this key by clicking on the parameter and typing the password for your import.io account.
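If you assemble that query url in code, it is worth URL-encoding the key before appending it. A minimal Python sketch; the base url below is a hypothetical placeholder, so use the exact url the Get API tab gives you:

```python
try:
    from urllib.parse import quote  # Python 3
except ImportError:
    from urllib import quote        # Python 2

def build_query_url(base_url, api_key):
    """Append the _apikey parameter to an import.io query url."""
    separator = '&' if '?' in base_url else '?'
    return base_url + separator + '_apikey=' + quote(api_key)

# Hypothetical base url copied from the Get API tab
url = build_query_url("https://api.import.io/some/query?input=page", "MY+KEY")
print(url)  # https://api.import.io/some/query?input=page&_apikey=MY%2BKEY
```

Encoding matters because import.io keys can contain characters like + and = that would otherwise be misread as part of the query string.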

Using import.io API in code

Below is the piece of code I am using in my application to get the dynamic data from seasky.org (baseUrlDefault, baseUrlEnd and baseAPI are defined elsewhere in the application).

function getCalenderInfoFromServer(calenderYear){
    $.ajax({
        url: baseUrlDefault + calenderYear + baseUrlEnd + baseAPI,
        type: 'GET',
        dataType: 'json',
        success: function (data) {
            console.log(data);
            parseEventDetails(data.results);
        },
        error: function (e) {
            alert(e);
        }
    });

}

Thanks for reading.

How To Set Up And Remotely Access A Linux Machine With OpenSSH Via putty.exe

- - Applications, Python, Tutorials
Hello readers. In this post, I will explain how to set up the essentials for accessing a device remotely from another device on some other network. Follow along.

To better understand the tutorial, read the background behind writing this article here

How to reserve an ip for your device?

I will explain how to reserve an ip for a device via the router’s settings. My router’s model is TL-WR740N, brand TP-Link. However, the concept and procedure are similar for all routers. First, go to your router’s settings:

192.168.0.1 (in my case)

Now provide valid credentials for the user and password. For most routers, the default username and password are both admin.

Click on the DHCP option, present in the menu on the left side in the case of TP-Link.

Now go ahead and click on Address Reservation

Click Add New to reserve an ip address for a device. You will be required to enter the MAC address of the device.

Open up a terminal with the key combination Ctrl + Alt + T on your linux machine and enter the following command.

ifconfig

Since I connect via Wi-Fi, I look for HWaddr under wlan0.

Go ahead and paste this MAC address into the address reservation form.

After this, enter a desired ip address, in my case 192.168.0.106. Check enable and save the settings.

Once done, you will be required to reboot the router to save these settings. That’s all for a static local ip.
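If you would rather grab the MAC address from Python than eyeball ifconfig output, the standard library can usually report it. A sketch only; note that uuid.getnode() may return a random stand-in value when no hardware address can be found, so treat ifconfig as authoritative:

```python
import uuid

def mac_address():
    """Format uuid.getnode()'s 48-bit integer as a colon-separated MAC string."""
    node = uuid.getnode()
    # Emit the six bytes from most to least significant, two hex digits each
    return ':'.join('%02X' % ((node >> shift) & 0xFF) for shift in range(40, -8, -8))

print(mac_address())  # e.g. 1C:87:2C:xx:xx:xx
```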

Setting up openssh-server and port forwarding

It is dead simple to set up an openssh server on a linux machine. Follow along.

Enter the following command to get openssh-server

sudo apt-get install openssh-server

Now, to get openssh running, issue the following command

sudo service ssh start

The default port for ssh is 22

The next step is to forward incoming traffic on port 22 of the router’s public ip to the local ip of the linux device, in my case 192.168.0.106.

For this, we again go to the router’s settings. Open up a browser and go to 192.168.0.1 or something similar to open them.

Find the menu named Forwarding or something similar and click it. Then, under the sub-menu, click on Virtual Servers.

Once done, click Add New.

Fill the details :

Service Port : 22

Internal Port : 22

IP Address: 192.168.0.106 (the local ip of the machine with openssh running)

Protocol: All (TCP/UDP)

Status: Enabled

Go ahead and save it.

What have we done so far? Assigned a static ip to a device. Installed openssh-server on the linux machine and started it. Forwarded incoming traffic on port 22 of the router to the local ip of the machine.

One more step. Depending on your ISP, the public ip of your connection may change, but it is static most of the time. In my case the ip barely changes, so all I had to do was jot down the public ip, which is what I did.

How to access a device remotely via ssh while on a Windows machine

Prerequisite: the machine running openssh-server must be powered on. However, it is not necessary to be logged in, and openssh-server starts automatically by default when the device is powered on.

Download putty.exe from here. PuTTY is a telnet and SSH client for Windows. It is very small and needs no installation; it’s a download-and-run program.

Once downloaded, run putty.exe. Enter the public ip of your home network, specify port 22, check SSH, then press Open. This opens a terminal. If the connection is established, meaning both the home network and the network you are on are connected to the internet, you will be prompted for the login credentials of your remote device. Providing the correct credentials gives you command-line access to your device on the home network.

Search for a file and send an email with attachment remotely via ssh and some python

Now that you have remote access to your linux machine, you can use ls (and friends) to find the location of the file you are looking for.

In my case, it’s present inside /home/bhishan/lispcognitive.odt

To send this file via email, we will use the Python interactive shell.

Run the following command to open the Python shell:

python

We will use two standard Python modules, smtplib and email, to send an email with an attachment.

The snapshot below shows the code I used to send the email.

[Screenshot: Python code to send an email with an attachment]

Below, I’ve formatted the code in a more readable way.

import smtplib
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.utils import formatdate

me = "fromemail@domain.com"
you = "toemail@domain.com"
password = "password"

msg = MIMEMultipart()
msg['From'] = me
msg['To'] = you
msg['Date'] = formatdate(localtime=True)
msg['Subject'] = "Sending email with attachment via python"

msg.attach(MIMEText("Please find the file attached."))

# Read the file in binary mode and attach it with a filename
with open("/home/bhishan/lispcognitive.odt", "rb") as f:
    attachment = MIMEApplication(f.read())
attachment.add_header('Content-Disposition', 'attachment',
                      filename="lispcognitive.odt")
msg.attach(attachment)

s = smtplib.SMTP_SSL("smtp.gmail.com")
s.login(me, password)
s.sendmail(me, [you], msg.as_string())
s.quit()

If you have any questions regarding the post or the code, comment below so we can discuss.

Mining Google Search Results Using Python – Making An SEO Kit With Python

- - Applications, Python, Tutorials, Web

[Screenshot: whatpageofsearchamion-style program in python]

By the end of this read, you will be able to make a fully functional SEO kit: you feed it the keywords associated with a certain url, and the program shows the position of that url in a Google search. If you know tools like “whatpageofsearchamion.com”, our program will be similar to that tool.

I was writing for a trekking website and had to make sure my write-up appeared in a good position in Google search. The keywords being competitive, I had to keep a day-to-day report of the article’s status. Therefore I decided to write a simple program that would find my article’s position in a Google search. Continue Reading
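The core of such a kit is the position check itself. A minimal sketch of that step, assuming the search-result urls have already been scraped into a list (the scraping is covered in the full article); find_position and the sample urls are my own illustrative names:

```python
def find_position(target_url, result_urls):
    """Return the 1-based rank of target_url in a list of result urls, or None."""
    for rank, url in enumerate(result_urls, start=1):
        if target_url in url:  # substring match tolerates tracking parameters
            return rank
    return None

# Hypothetical scraped search results
results = [
    "http://example.com/other",
    "http://mytrekkingsite.com/article?ref=serp",
    "http://example.org/else",
]
print(find_position("mytrekkingsite.com/article", results))  # 2
```

Returning None when the url is absent lets the caller distinguish “not on this page” from a real rank, which matters when you page through results ten at a time.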

Mining Facebook – Mining the Social Web using python

- - Python, Tutorials

Mining data from different sources has become a trend over the past few years. The need for structured, characteristic data has led data miners to point their machines at social data. Twitter, Google+ and Facebook are some of the social networks now serving as mountains of behavioral data. In this post Facebook will be our source and Python the miner. Python is well known for its capability to extract web data effectively and efficiently. Cutting the long story short, today we will use the Facebook Graph API through Python to mine the Facebook page likes of each individual in our friend list. Continue Reading