Archive of September 2010

Fri 17 Sep

Using the Facebook API to retrieve mutual friends

A few nights ago I had a (bad) idea for a tool that leveraged the Facebook API. I'll spare you the details, but my tool needed to retrieve the list of mutual friends for each of the logged-in user's friends. As a developer new to the Facebook API, I found this a bit trickier to figure out than I had hoped, so here's a quick little PHP script that shows how I went about solving the problem.

I wound up using the REST API's friends.getMutualFriends query. This code sample uses the new Graph API to retrieve a list of your friends, then displays the profile picture of anyone who has more than ten mutual friends with you. Note that this can take a while to run for people with a large number of friends.

$facebook = new Facebook(array(
    'appId'  => '<your app id>',
    'secret' => '<your secret id>',
    'cookie' => true,
));
$session = $facebook->getSession();
$me = $facebook->api('/me');  // the logged-in user's own profile

$my_friends = $facebook->api('/me/friends');  // Graph API call, retrieves own friend list
foreach ($my_friends['data'] as $person) {
    $friend_uid = $person['id'];

    // Old REST API call. Gets the mutual friends (source must be the logged-in user).
    $param = array('method'     => 'friends.getMutualFriends',
                   'source_uid' => $me['id'],
                   'target_uid' => $friend_uid,
                   'callback'   => '');
    $res = $facebook->api($param);
    if (count($res) > 10) {
        echo "<a href=\"".$friend_uid."\">"
            ."<img src=\"".$friend_uid."/picture\"></a>: "
            .count($res)." friends in common<br>";
    }
}

Unfortunately, this does not provide a way to retrieve the full friend list of an arbitrary friend of the logged-in user. As far as I can tell, this is not possible using any of the Facebook APIs. If you know of a way, please leave a note in the comments!

Thu 16 Sep

How to Kill an Unresponsive SSH Session, and other useful escape sequences

I always forget how to do this.

[newline] ~ .

That's the Enter key, a tilde, then a period. Presto, back to your friendly (local) console.

Here are a few more useful escape sequences, straight from man ssh (note that all must be preceded by a newline character):

The supported escapes (assuming the default ‘~’) are:

 ~.      Disconnect.

 ~^Z     Background ssh.

 ~#      List forwarded connections.

 ~&      Background ssh at logout when waiting for forwarded connection / X11 sessions to terminate.

 ~?      Display a list of escape characters.

 ~B      Send a BREAK to the remote system (only useful for SSH protocol version 2 and if the peer supports it).

 ~C      Open command line.  Currently this allows the addition of port forwardings using the -L, -R and -D options (see above).  It also allows the cancellation of existing remote port-forwardings using -KR[bind_address:]port.  !command allows the user to execute a local command if the PermitLocalCommand option is enabled in ssh_config(5).  Basic help is available, using the -h option.

 ~R      Request rekeying of the connection (only useful for SSH protocol version 2 and if the peer supports it).
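If tildes clash with your session (say, in nested ssh connections), the escape character itself can be changed or disabled. Here's a sketch using standard OpenSSH options; the host name is a placeholder:

```
# On the command line: disable escapes entirely for this connection
#   ssh -e none user@host
#
# Or per-host in ~/.ssh/config:
Host example
    EscapeChar ^]    # use Ctrl-] instead of the default ~
```

One more trick for nested sessions: doubling the escape (~~) sends a literal tilde to the inner connection, so ~~. disconnects the inner ssh rather than the outer one.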

Nod to this guy, whose site I've always wound up at when I'm trying to remember this.


Tue 14 Sep

One-liner to extract a list of link addresses from an HTML file

I'm moving my research group's website to a new server and making some updates at the same time. One of the main things I need to do is make sure links are going to work after the transition. Here is a little one-line shell "script" (if you can call it that) that will extract link addresses from an HTML web page:

wget -q -O - [url] | tr " " "\n" | grep "href" | cut -f2 -d"\""

wget fetches the page and writes its content to stdout. tr replaces all spaces with newlines, grep filters out every line that doesn't contain an "href", and finally cut displays everything between the first pair of double quotes.

If you want to use a file you have on your local machine, you can use this variant instead:

tr " " "\n" < [file_name.html]| grep "href" | cut -f2 -d"\""

Obligatory disclaimer: HTML is NOT a regular language and in general cannot be reliably parsed with regexes or quick text-munging pipelines like this one. This is not guaranteed to work.
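Under the same caveat, here's a slightly sturdier variant I've been playing with (my own addition, assuming GNU grep): grep -o prints only the matching part of each line, so it still works when several links sit on one line.

```shell
# Sample page to demonstrate on; substitute your own file.
printf '<p><a href="index.html">home</a> <a href="about.html">about</a></p>\n' > sample.html

# grep -o emits each href="..." match on its own line, even when
# multiple links share a line; cut keeps what's inside the quotes.
grep -o 'href="[^"]*"' sample.html | cut -f2 -d'"'
# prints: index.html, then about.html
```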


Fri 10 Sep

Landfill Diaries

I just discovered this fascinating blog, Landfill Diaries. From the about page:

"Landfill Diaries is maintained by Marijke Rijsberman, who accidentally fell in love with landfill in 2001, as she sat and watched the sun rise over San Francisco Bay at Candlestick Point. That put her almost on top of the city’s old garbage dump, right next to the current transfer station, poised to notice that endless stream of garbage trucks trundling by.

Authorities refer to garbage as the waste stream, as if, like a mountain brook, it burbles pleasantly to its final destination with no further encouragement besides gravity. The reality is a little different. Watching the sun rise and counting trucks, I finally got it: our garbage is huge and intractable. An almost invisible but very large industry takes care of it, behind the scenes, in places we rarely visit, don’t worry about, and usually don’t inquire into.

Even though our castoffs weigh on the land, it’s still possible just to drag our garbage to the curb every week and never think about it again. These pages are meant to change that situation at least a tiny bit: they examine and trace and pay respects to our invisible garbage. Landfill is not just real, it’s ours. And it is, upon closer inspection, very interesting."

And so is this blog. Definitely worth your time to check it out, especially if you've ever had an interest in the places your trash goes after the truck picks it up.

Sat 4 Sep

A time-management tool for Ubuntu

This is a simple script to help increase your focus and productivity, in the Pomodoro style. It integrates nicely with Ubuntu 9.04+ via libnotify, and disables network access during "focus" periods. OS X support is in the works (but don't hold your breath).

After you run the script, you'll have ten minutes of uninterrupted work time, enforced by disabling your network adapters to keep you from browsing the internet. When the ten minutes are up, you've earned a two-minute break, and your network adapters will be brought back up.

Download at GitHub

Let me know if you find this useful (or if you find problems that prevent it from being useful to you)!