I want to expand bulk numbers of short URLs streamed from Twitter. Rather than issuing an individual request for each one, I want to use an API that accepts a list of short or tiny URLs and returns the original URLs. Are such APIs available?
-
possible duplicate of http://stackoverflow.com/questions/902192/how-to-get-long-url-from-short-url – CoderDennis Jul 15 '09 at 23:23
6 Answers
99% of all URL shorteners have an API.
For example, there's a PEAR package (PHP) called Services_ShortURL that supports:
- bit.ly
- digg
- is.gd
- short.ie
- tinyurl.com

– Till
Have a look at the bit.ly API or the budurl.com API.
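As a minimal sketch of what scripting the bit.ly route might look like: the endpoint path, the `format=txt` option, and the `access_token` parameter here are assumptions based on the bit.ly v3 REST API of that era and may have changed, and `ACCESS_TOKEN` is a placeholder for your own credential — check bit.ly's current documentation before relying on any of it.

```bash
#!/bin/bash
# bitly-expand.sh -- sketch of expanding one short URL via bit.ly's REST API.
# ASSUMPTIONS: endpoint, format=txt, and access_token follow the bit.ly v3
# API of the time; ACCESS_TOKEN is a placeholder, not a working credential.

# Build the request URL; the short URL argument must itself be URL-encoded
# (e.g. http%3A%2F%2Fbit.ly%2Fabc), since it is embedded in a query string.
build_expand_url() {
  local token=$1 encoded_short=$2
  printf 'https://api-ssl.bitly.com/v3/expand?access_token=%s&format=txt&shortUrl=%s' \
    "$token" "$encoded_short"
}

# Example call (requires network access and a real token):
# curl -s "$(build_expand_url "$ACCESS_TOKEN" 'http%3A%2F%2Fbit.ly%2FGFscreener12')"
```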

– Sani Huttunen
-
That is the bit.ly API... How is it going to help with all the other short URL services? – gahooa Jul 15 '09 at 23:27
Untiny.me's online service turned out to be useful:
http://untiny.me/api/1.0/extract/?format=text&url=bit.ly/GFscreener12
So conceivably a simple Bash script reading each line as a short URL would work:

```bash
#!/bin/bash
# urlexpander.sh by MarcosK
# Reads one short URL per line from stdin and prints the expanded URL.
while read -r URLline; do
    curl -s "http://untiny.me/api/1.0/extract/?format=text&url=$URLline"
    echo    # print a newline in case the API response lacks one
done
```

To test, feed it a single URL:

```bash
echo "bit.ly/GFscreener12" | ./urlexpander.sh
```

or send it your whole input file, one short URL per line, with:

```bash
cat urllist.txt | ./urlexpander.sh
```

– Marcos
-
However, for my own code I actually like [gahooa's method](http://stackoverflow.com/a/1134628/1069375) better, because it doesn't depend on a third-party service or make an extra web fetch, and, most importantly, it works equally well with standard URLs, not just short ones, returning them unchanged. Just change the `curl` line inside the while loop to `curl -sI "$URLline" | grep Location | awk '{print $2}'` – Marcos Feb 17 '12 at 22:01
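As a sketch, a full redirect-following variant along the lines the comment describes might look like this. It is untested against live shorteners; the `Location` header is matched case-insensitively since servers vary its capitalization, and the carriage return curl leaves at the end of each header line is stripped:

```bash
#!/bin/bash
# urlexpander2.sh -- redirect-following variant: ask the short URL's own
# server where it points, instead of using a third-party expansion service.

# Extract the Location header value from a raw HTTP response header block.
extract_location() {
  grep -i '^Location:' | awk '{print $2}' | tr -d '\r'
}

# Only run the loop when executed directly, not when sourced.
if [ "${BASH_SOURCE[0]}" = "$0" ]; then
  while read -r URLline; do
    target=$(curl -sI "$URLline" | extract_location)
    # A standard (non-short) URL returns no Location header: pass it through.
    echo "${target:-$URLline}"
  done
fi
```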