Finding a song by its lyrics: Play By Lyrics.com

Tags: websites, webdevelopment, music.
By lucb1e on 2012-05-27 03:56:22 +0100

Most readers of this blog will know how you can find a song by its lyrics: You just google the lyrics you remember plus the word "lyrics" itself, and it will often come up with the song you want. Then you copy (or type) the artist and song name over to Grooveshark or Youtube, and you can listen to it.

This has been happening to me quite a lot lately: knowing the lyrics but not the song name, and the process of getting it to play felt like double work. First googling, then going to Youtube, then clicking the result... So I decided to take action :)
Play By Lyrics.com is the result!

I've toyed with the idea of searching for lyrics within your current music library before, but it was quite a lot of work to figure out how to read your Grooveshark library, find the lyrics for each song, and so on, so I never got around to it. Just automating the process of copying the Google result and clicking the one on Youtube is much faster to build, and for the user it probably can't get any easier than it is now.

I had the basic thing up and running in 20 minutes, I think, and it just worked great. Whatever I gave it, it would almost every time give me the right video at once; if not, it was my fault for giving it too little information. There wasn't much of a GUI yet, but the concept worked. Only after doing this did I realize that there must have been many people before me.

Searching on Google, nobody seemed to have made this before. But it's hard to search for anyway; you just get results like "find song lyrics". I thought that playbylyrics.com would certainly give me a site, but it didn't! Could not find server... Wait, did that mean the domain wasn't taken yet?
A quick lookup made clear that it wasn't.

Deliberating for 10 minutes (well, not continuously), I decided that 15 euros wasn't going to kill me; I'd see how popular it would get. I'm not after the money: if just 30 people used it per month I'd be happy. To compensate for hosting costs I would probably add ads later, but in a very unobtrusive way (text ads only, no flashy images).

Getting the .com domain was surprisingly quick. Paying via the Dutch iDEAL system, which is pretty much instant, I had bought the domain and had access to the registrar's admin panel within minutes. Usually you have to wait for a zone update, but apparently .com domains don't have that (.nl does). I hadn't remembered that from registering lucb1e.com, but it was a pleasant surprise :)

I pointed it to the right IP, added the vhost in Apache, and that was pretty much it. From idea to working product in under 40 minutes. It surprised me how quickly it actually all went.
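For reference, adding a vhost really is only a few lines. A minimal sketch of what that Apache configuration could look like (the document root path is hypothetical; the real path depends on the server setup):

```apache
<VirtualHost *:80>
    ServerName playbylyrics.com
    ServerAlias www.playbylyrics.com
    # Hypothetical document root, not the actual server path
    DocumentRoot /var/www/playbylyrics
</VirtualHost>
```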

After that I did some polishing: a decent font, a favicon, and a contact/about page. Then I let it sit for a day, coming up with more ideas the next day. I added a privacy and disclaimer page, did some more layout polishing, and realized that Google actually heavily throttles their (deprecated) search API.

So I had to come up with something better for the API. I had a bit of a fight with it, but it works great now. It's practically unlimited: as long as I'm not sending out 3000 requests per minute, it should be totally fine.
The Youtube API is still limited though, and they don't specify any limit in the documentation. When resources run out, they're going to throttle me... Right. I'll just have to keep watch.
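Keeping watch doesn't have to be manual; the server could track its own request rate and refuse to go over a self-imposed cap. A minimal sketch in Python (hypothetical, not the site's actual PHP code):

```python
import time

class RequestBudget:
    """Track outgoing API requests and refuse to exceed a per-minute cap."""

    def __init__(self, max_per_minute):
        self.max_per_minute = max_per_minute
        self.timestamps = []  # send times within the last 60 seconds

    def allow(self, now=None):
        """Return True if a request may be sent now, recording it if so."""
        now = time.time() if now is None else now
        # Drop entries older than one minute
        self.timestamps = [t for t in self.timestamps if now - t < 60]
        if len(self.timestamps) < self.max_per_minute:
            self.timestamps.append(now)
            return True
        return False
```

Once the budget is exhausted, the site could show a "try again in a minute" message instead of silently getting throttled by the API.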

Letting more days pass, I thought of a feedback page and tweaked some text and the workings of the back-end system. Right now I have plenty of ideas to add, but they are all upgrades rather than improvements to the current way it works. So I decided it was time to tell the world about it :)

Like with this website, one of the main design goals was speed. The one thing I couldn't improve is how fast it can look up results on Google. I thought of saving every result for quicker access, or using Google suggestions to autocomplete the word a user was typing and prefetch results. In the end, all these ideas improve the speed only marginally while loading the service (and Google, not to forget*) a lot more.

* They might be sitting on a huge pile of money with more than enough capacity, but that doesn't make me feel like sending many requests per second just to hopefully speed up users' searches by two tenths of a second.

The loading time of the homepage is under a tenth of a second (including network latency) when trying from my internship, which is faster than blinking. Search results take half a second, which I thought was huge. Somehow I (and the people I asked to betatest) experienced it as very fast though, and there isn't much I can do about it anyway.

Your results might sometimes take longer. Google gives 10 search results. The site first searches Youtube for the top one. If that doesn't turn up any Youtube results, it searches Youtube for the second Google result, and so on, up to the third Google result (more would overload the service, and after 3 attempts I think the odds are low that anything will turn up).

Doing this twice (say the second Google result is the one that turns up Youtube results) takes more time. I'm not sure how much, but querying Google + Youtube takes 500ms; querying Google + Youtube + Youtube again, I'd guess around 750ms.

If Google itself doesn't have any results, it will also search Youtube directly, with and without the "lyrics" keyword. This too might slow it down a bit.
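Put together, the lookup logic described above looks roughly like this. A sketch in Python (hypothetical; `google_search` and `youtube_search` are stand-ins for the real API calls, each taking a query string and returning a list of results):

```python
def find_video(query, google_search, youtube_search):
    """Try Youtube for the top Google results; fall back to direct Youtube search.

    google_search and youtube_search are placeholder callables for the
    actual API requests, not real library functions.
    """
    google_results = google_search(query + " lyrics")
    # Try Youtube for the first three Google results only; more attempts
    # would load the service while rarely finding anything.
    for result in google_results[:3]:
        videos = youtube_search(result)
        if videos:
            return videos[0]
    # Google gave nothing usable: search Youtube directly,
    # with and without the "lyrics" keyword.
    for q in (query + " lyrics", query):
        videos = youtube_search(q)
        if videos:
            return videos[0]
    return None
```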

To speed up the site even further, I avoided using PHP where I didn't need to. It's overkill really; I just like to get it as fast as possible without migrating hosting to Amazon's cloud or something like that.
The About and Disclaimer pages are both plain HTML files so that Apache doesn't have to run them through the PHP parser first. The only thing I haven't implemented yet is client-side caching of the homepage and the results.

Another thing I did was place the CSS inline. Yeah, it might be considered bad practice, but even Google does it, and it prevents an ugly glitch: the page might render before the CSS loads, making you see things shift around. Having it inline makes sure the CSS is always loaded before the rest, so the page renders entirely correctly at once. Maybe it's actually slower, since the resource can't be cached, but I noticed that a page which doesn't have to re-render or shift things around looks much like it's loading instantly. The only problem is network latency for non-EU visitors, I guess. But the CSS isn't much; it doesn't even fill an ethernet frame, so it won't be noticeably slower than cached CSS.


So besides speed, what other features does it have?
- I thought of making an autoreplay script myself, but it turned out to be more work than I expected. Also, I just can't beat the sliders on infinitelooper.com: I could replicate them, but it wouldn't get any better, so why bother. So now I just link to that site for quick and easy auto-replay.

- There is another link to the lyrics it found.

- It automatically focuses the search field, so you don't have to click it to change your search query and search again.

- There is a link to try another search result in case you got a wrong or bad-quality video.


Another thing I wanted to add was prefetching the next result in Javascript rather than loading a new page and wasting half a second of your life. But this would break the 'feature' that you can share links with others and expect them to give the same result. If you clicked a prefetched result, then copied the URL and passed it to a friend, they would get a different song (namely, the first result).

I would use pushState for this, but browser support is really bad. It's your own fault if you use Internet Explorer... but on the other hand, I do make this for the users, not just the tech world.

Using a fragment identifier (/?lyrics=x#result-2) isn't really great either. Fragments are meant to jump to a certain position on the page, not to load a completely different result. The fragment isn't even sent to the server, which would try to load the first result before the Javascript on the page realizes the URL says #result-2. Things just get overly complicated; it's not a nice solution...

What I think I'll do is prefetch the next result on the server, so that a pageload goes almost as fast as loading the front page (it then doesn't need to query any API anymore).


One funny effect this project had on me is that I've been listening to music a lot more these last few days. The first day I needed to come up with lyrics to test it with, and that was harder than you might expect (though not that hard either). So when testing more on subsequent days, I guess it made me remember more and more music that I wanted to listen to.

My current favorite: let the only sound be the overflow pockets full of stones
Kudos if you can guess it already :)

One problem with this specific song is that Youtube's encoding makes it sound bad, especially noticeable at 240p around 1:19, but it's also present at 360p and up. The same song on Grooveshark doesn't have this problem.


Enjoy!
lucb1e.com

