
I want to expand large numbers of short URLs streamed from Twitter. Rather than issuing an individual request for each one, I want to use APIs that accept a list of short or tiny URLs and return the original URLs. Are such APIs available?

CloudyMarble

6 Answers

4

Not really an API, but this one-liner will give you the URL really fast:

curl -sI "insert short URL here" | grep -i '^Location' | awk '{print $2}'

(-s silences the progress output, -I fetches only the headers, and grep -i catches servers that send a lowercase location: header.)

Laurel
gahooa
4

99% of all URL shorteners have an API.

For example, there's a PEAR package (PHP) called Services_ShortURL that supports:

  • bit.ly
  • digg
  • is.gd
  • short.ie
  • tinyurl.com
Joel Coehoorn
Till
1

There are a few websites around that are dedicated to converting shortened URLs back to their originals.

Two I know of that have APIs are LongURL and Untiny.me. I'm in the middle of writing a Java library that uses both of them.

Evan
1

I had written a small script to turn short URLs into their original links. It's based on the HTTP headers returned by the short URLs.
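
The poster's script isn't shown; below is a minimal sketch of the same idea, with the header parsing split into its own function so it can be reused. All function and file names here are my own invention, not the original author's:

```shell
#!/bin/sh
# Read raw HTTP response headers on stdin and print the Location value.
# tr strips the carriage returns curl leaves in HTTP headers.
parse_location() {
  tr -d '\r' | awk 'tolower($1) == "location:" { print $2; exit }'
}

# Fetch only the headers of a short URL (-I) and extract the redirect target.
expand_url() {
  curl -sI "$1" | parse_location
}

if [ $# -gt 0 ]; then
  expand_url "$1"
fi
```

Usage would be something like `./expand.sh bit.ly/GFscreener12`, or source the file and call expand_url in a loop over your input.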

vsr
0

Have a look at the bit.ly API or the budurl.com API.
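
As a rough sketch of the bit.ly route (the endpoint and parameter names are from bit.ly's v3 REST API; the access token is a placeholder and the helper function name is my own), an expand request can be assembled like this:

```shell
# Build a bit.ly v3 expand request URL.
# ACCESS_TOKEN is a placeholder -- you need your own bit.ly API token.
# Note: in a real call the shortUrl value should be percent-encoded.
bitly_expand_request() {
  short_url="$1"
  token="$2"
  printf 'https://api-ssl.bitly.com/v3/expand?access_token=%s&shortUrl=%s\n' \
    "$token" "$short_url"
}

# Usage: curl -s "$(bitly_expand_request http://bit.ly/GFscreener12 ACCESS_TOKEN)"
```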

Laurel
Sani Huttunen
0

Untiny.me's online service proved useful for this: http://untiny.me/api/1.0/extract/?format=text&url=bit.ly/GFscreener12

So conceivably a simple Bash script reading each line as a short URL would work:

#!/bin/bash
# urlexpander.sh by MarcosK
while read -r URLline; do
  curl -s "untiny.me/api/1.0/extract/?format=text&url=$URLline"
done

To test, feed it a single URL with echo "bit.ly/GFscreener12" | ./urlexpander.sh, or send it your whole input file, one short URL per line, with:

cat urllist.txt | ./urlexpander.sh
Marcos
  • However, for my own code I actually like [gahooa's method](http://stackoverflow.com/a/1134628/1069375) better because it doesn't need a third-party service or an extra web fetch and, most importantly, it works equally well with standard URLs, not just short ones, returning them unchanged. So change the `curl` line inside the while loop to `curl -sI "$URLline" | grep Location | awk '{print $2}'` – Marcos Feb 17 '12 at 22:01