In computing, linked data (often capitalized as Linked Data) is a method of publishing structured data so that it can be interlinked and become more useful through semantic queries. It builds upon standard Web technologies such as HTTP, RDF and URIs, but rather than using them to serve web pages for human readers, it extends them to share information in a way that can be read automatically by computers. This enables data from different sources to be connected and queried.
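The idea can be sketched in plain Python: data items are identified by URIs, statements about them are subject–predicate–object triples, and a "semantic query" walks those triples across sources. The URIs and vocabulary below are invented purely for illustration; a real system would use RDF and a query language such as SPARQL.

```python
# A minimal sketch of linked data as subject-predicate-object triples.
# All URIs and predicate names here are hypothetical.
triples = {
    # "source A": facts about a city
    ("http://example.org/city/Berlin", "rdf:type", "ex:City"),
    ("http://example.org/city/Berlin", "ex:country", "http://example.org/country/Germany"),
    # "source B": facts about a country, published separately
    ("http://example.org/country/Germany", "ex:population", "83000000"),
}

def objects(subject, predicate):
    """Return all objects of triples matching (subject, predicate, _)."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# A query that spans both sources: the population of the country Berlin is in.
country = objects("http://example.org/city/Berlin", "ex:country")[0]
population = objects(country, "ex:population")[0]
print(population)  # 83000000
```

Because each object can itself be a URI, a query can hop from one source's data into another's, which is the sense in which the data "becomes more useful" when interlinked.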
Tim Berners-Lee, director of the World Wide Web Consortium (W3C), coined the term in a 2006 design note about the Semantic Web project.
In that 2006 "Linked Data" note, Berners-Lee outlined four principles of linked data, paraphrased along the following lines:

1. Use URIs to name (identify) things.
2. Use HTTP URIs so that these things can be looked up ("dereferenced").
3. Provide useful information about what a name identifies when it is looked up, using open standards such as RDF or SPARQL.
4. Refer to other things using their HTTP URI-based names when publishing data on the Web.
Tim Berners-Lee gave a presentation on linked data at the TED 2009 conference. In it, he restated the linked data principles as three "extremely simple" rules: first, that things are given names that start with HTTP; second, that looking up one of those HTTP names returns data in a standard format that is useful to know about that thing; and third, that the data returned expresses relationships, and each related thing is in turn given one of those names that starts with HTTP.
Linked open data is linked data that is open content. Tim Berners-Lee gives the clearest definition of linked open data, differentiating it from linked data in general: "Linked Open Data (LOD) is Linked Data which is released under an open licence, which does not impede its reuse for free."
Large linked open data sets include DBpedia and Freebase (now defunct).
The term "linked open data" has been in use since at least February 2007, when the "Linking Open Data" mailing list was created. The mailing list was initially hosted by the SIMILE project at the Massachusetts Institute of Technology.
The goal of the W3C Semantic Web Education and Outreach group's Linking Open Data community project is to extend the Web with a data commons by publishing various open datasets as RDF on the Web and by setting RDF links between data items from different data sources. In October 2007, the datasets consisted of over two billion RDF triples, interlinked by over two million RDF links. By September 2011 this had grown to 31 billion RDF triples, interlinked by around 504 million RDF links. A detailed statistical breakdown was published in 2014.
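The "RDF links" counted above are themselves just triples whose subject and object identify resources in different datasets. As a rough sketch (the dataset contents and URIs below are invented), interlinking amounts to taking the union of two triple sets and following a cross-dataset link such as owl:sameAs:

```python
# Two hypothetical datasets describe the same real-world entity under
# different URIs; an owl:sameAs triple is the "RDF link" joining them.
dataset_a = [
    ("http://dbpedia.example/Berlin", "label", "Berlin"),
    ("http://dbpedia.example/Berlin", "owl:sameAs", "http://geo.example/2950159"),
]
dataset_b = [
    ("http://geo.example/2950159", "population", "3500000"),
]

# The "data commons" is simply the union of the published triples.
graph = dataset_a + dataset_b

def describe(uri):
    """Collect all facts about a URI, following owl:sameAs links one hop."""
    aliases = {uri}
    for s, p, o in graph:
        if s == uri and p == "owl:sameAs":
            aliases.add(o)
    facts = {}
    for s, p, o in graph:
        if s in aliases and p != "owl:sameAs":
            facts[p] = o
    return facts

print(describe("http://dbpedia.example/Berlin"))
# merges facts from both datasets into one description
```

Counting triples of the owl:sameAs kind (and other cross-dataset predicates) is how figures such as "504 million RDF links" are arrived at.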