According to my data, the number of new users learning about Bitcoin is still decreasing.
(This is an index number based on statistics from websites/webpages that explain Bitcoin.)
..(huge image removed)..
Today's index (not plotted here, but visible on the hourly chart) looks like it's going to be a new yearly low for a Friday.
This is the most interesting thing I've seen on the wall thread in at least a week: Lots of new bits of data. Please, please give us more color on what is being charted, otherwise the bits are wasted.
tl;dr: Lots more people were looking up information about Bitcoin when it was peaking in price. The chart starts in January, when the price was still over $800. As the price relaxed there was less interest (except when significant, fast drops in price occurred).
The chart would be FAR more meaningful if it extended back to about April of last year, when the price was stable at $90-120, so you could see the change in interest from then, through the price jump, to now. I'm sure it's a few times higher now than a year ago.
Aminorex: There are a lot of websites and webpages explaining Bitcoin. If the statistical data isn't public (only a few sites publish it), I figure out what type of machine the website is running on. Depending on the type of machine, I use common techniques such as remote resource load monitoring through webserver response times to create an index number based on the difference from the average load at stable times.
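The response-time idea above can be sketched roughly like this. This is my own minimal illustration, not the author's actual script: sample a page's response time a few times, then express current load as a ratio against a baseline average taken at a known-quiet period. The URL, sample counts, and timings are invented for illustration.

```python
# Hypothetical sketch of a response-time-based load index.
# Not the author's code; names and numbers are made up.
import time
import urllib.request
from statistics import mean

def sample_response_time(url: str, timeout: float = 10.0) -> float:
    """Return wall-clock seconds for one request to `url`."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read(1024)  # read a small chunk so the server actually answers
    return time.monotonic() - start

def load_index(samples: list[float], baseline_avg: float) -> float:
    """Index > 1.0 means slower (busier) than the quiet-time baseline."""
    return mean(samples) / baseline_avg

# Example with canned timings instead of live requests:
quiet = [0.20, 0.22, 0.21]   # response times measured at a stable period
busy = [0.45, 0.50, 0.40]    # response times measured now
print(round(load_index(busy, mean(quiet)), 2))  # → 2.14
```

The interesting part is the normalization: absolute response times vary wildly between hosts, so only the ratio against each site's own quiet baseline is comparable across sites.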
It looks like the data is dominated by the Wikipedia page statistics, yet that page is weighted relatively low, considering it is linked from many of the other resources I get the data from.
It takes some work to consistently get an evenly weighted index number for the graph, because sites disappear and new ones appear. To be honest, it is quite a mess due to my lack of interest in keeping it up.
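One way the "sites disappearing" problem could be handled, purely as a sketch of my own and not the author's method, is to renormalize the weights over whichever sites actually responded in a given round, so the live weights always sum to 1. Site names and weights here are invented.

```python
# Minimal sketch: combine per-site load readings into one weighted index,
# rescaling weights when some sites are unreachable. Not the author's code.
def weighted_index(readings: dict[str, float], weights: dict[str, float]) -> float:
    """`readings` holds only the sites that responded this round;
    missing sites drop out and the remaining weights are rescaled."""
    live = {site: w for site, w in weights.items() if site in readings}
    total = sum(live.values())
    if total == 0:
        raise ValueError("no live sites to index")
    return sum(readings[s] * w / total for s, w in live.items())

weights = {"wikipedia": 0.2, "siteA": 0.4, "siteB": 0.4}   # made-up weights
readings = {"wikipedia": 2.0, "siteA": 1.0}                # siteB unreachable
print(round(weighted_index(readings, weights), 3))  # → 1.333
```

Without the rescaling, a site dropping out would silently pull the index toward zero, which would look like falling interest when it's really just a dead mirror.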

But hey, I'm a hacker (the good kind), not a financial analyst.

Klondike_bar: The script has been monitoring (and mirroring) this data for Bitcoin, Litecoin, Peercoin and Feathercoin since about June 2012. However, the data folder is over 1300 GB and the code to analyze it is slow, even after a lot of efficiency updates. It's not exactly high-end hardware. I have it running right now, and I was wondering the same thing as you, so for the last couple of days I've been trying to speed up the process.
tl;dr: Used some fairly obscure techniques to get usage data from websites, combined with publicly available data.
Hacking on my code to speed up filtering the 1 TB+ of data so I can generate the graph back to April 2013.
Running optimized on 6 CPUs, the graph from 1 January 2013 until 16 May 2014 should be done in about 3-4 days.
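Splitting a big filtering job across 6 worker processes could look something like the sketch below. This is a generic illustration of the approach, assuming nothing about the real data format: the file contents and the filter predicate are invented, and the actual script is not public.

```python
# Hedged sketch: parallel filtering of log lines across worker processes.
# The predicate and sample data are placeholders, not the author's format.
from multiprocessing import Pool

def filter_chunk(lines: list[str]) -> list[str]:
    """Keep only records mentioning bitcoin (placeholder predicate)."""
    return [ln for ln in lines if "bitcoin" in ln.lower()]

def parallel_filter(lines: list[str], workers: int = 6) -> list[str]:
    # Split the input into roughly one chunk per worker, filter in parallel,
    # then flatten the per-chunk results back into one list.
    size = max(1, len(lines) // workers)
    chunks = [lines[i:i + size] for i in range(0, len(lines), size)]
    with Pool(workers) as pool:
        results = pool.map(filter_chunk, chunks)
    return [ln for chunk in results for ln in chunk]

if __name__ == "__main__":
    data = ["2013-04-01 bitcoin lookup",
            "2013-04-01 litecoin lookup",
            "2013-04-02 BITCOIN faq hit"]
    print(parallel_filter(data, workers=2))
```

Since the bottleneck described above is CPU-bound analysis rather than disk reads, processes (not threads) are the natural fit in Python, as each worker gets its own interpreter and sidesteps the GIL.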
It would be an easy way of predicting big differences in short-term demand and, possibly, price. Therefore I predict it won't work. We'll see in a couple of days.