This article by Google web performance engineer and developer advocate Ilya Grigorik is a great introduction to Google's BigQuery, which you can use to explore performance statistics for roughly 300,000 of the web's most popular websites. The data comes from HTTP Archive, a project that crawls those hundreds of thousands of websites twice a month and records statistics about how each site is built and how well it performs.
Using SQL queries, you can search HTTP Archive on BigQuery to answer virtually any question you may have about web performance. How fast do the fastest websites load? What makes them fast? How are they built? What libraries do they use?
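As a rough illustration of the kind of question you can answer, here is a sketch of a BigQuery query over one of HTTP Archive's dated `pages` tables. The table name and column names (`urlShort`, `onLoad`, `bytesTotal`) follow the 2013-era legacy SQL conventions and are assumptions; check the current HTTP Archive schema before running it.

```sql
-- Compare average load time and page weight for pages served
-- over HTTPS vs. plain HTTP in one crawl.
-- Table and column names are illustrative, not verified against
-- the current HTTP Archive schema.
SELECT
  IF(urlShort LIKE 'https%', 'https', 'http') AS scheme,
  COUNT(*) AS pages,
  ROUND(AVG(onLoad) / 1000, 2) AS avg_onload_sec,   -- onLoad is in ms
  ROUND(AVG(bytesTotal) / 1024, 0) AS avg_kb        -- total transfer size
FROM [httparchive:runs.2013_06_01_pages]
GROUP BY scheme
ORDER BY avg_onload_sec;
```

Because BigQuery scans the full table for each query, even aggregate questions like this one come back in seconds across hundreds of thousands of rows.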
Here's a talk by Grigorik from Velocity 2013, in which he demonstrates using BigQuery to search the almost 400 GB of raw CSV data from HTTP Archive: