Elephants, war rooms and 30 petabytes – this is what it takes to keep Facebook’s data in check according to one of its most recent blog posts. After building out its new data center in Prineville, ...
A prism is used to separate a ray of light into its constituent frequencies, spreading them out into a rainbow. Maybe Facebook’s Prism will not make rainbows, but it will have the ability to split ...
As the world’s largest social network, Facebook accumulates more data in a single day than many good-sized companies generate in a year. Facebook stores much of the data on its massive Hadoop cluster, ...
With a billion users and a need to analyze more than 105 terabytes every 30 minutes, Facebook’s appetite for crunching data has reached Godzilla-like proportions. Much of that data, including ...
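To put that headline figure in perspective, a quick back-of-the-envelope calculation helps. The snippet below is purely illustrative: it derives a sustained scan rate from the 105-terabytes-per-30-minutes claim above, not from any numbers Facebook itself has published.

```python
# Back-of-the-envelope: what does "105 TB every 30 minutes" imply
# as a sustained scan rate? Inputs come from the headline claim
# above; the result is illustrative only.

TB = 10**12  # decimal terabyte, in bytes

data_per_window = 105 * TB   # bytes analyzed per window
window_seconds = 30 * 60     # a 30-minute window

rate_bytes_per_sec = data_per_window / window_seconds
print(f"Sustained rate: {rate_bytes_per_sec / 10**9:.1f} GB/s")
# ~58.3 GB/s, i.e. roughly 5 PB scanned per day at that pace
```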
Today’s big story in the enterprise and infrastructure sector is, of course, the $2.25 billion acquisition by EMC of storage vendor Isilon. The announcement had no sooner hit the wires than the long ...
Facebook has overcome some of the limitations of the Apache Hadoop data processing platform, its engineers assert. The company has released source code for scheduling workloads on the Apache Hadoop data ...
Facebook has revealed it developed a replication system to move a 30 petabyte (PB) file system to a new data centre in Oregon. Facebook’s data warehouse Hadoop cluster grew 10 PB over the year to ...
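A live migration like this typically follows a two-phase pattern: a one-time bulk copy of the existing data, followed by continuous replay of files changed during the copy until the destination catches up and traffic can be cut over. The sketch below is a minimal illustration of that pattern only; every name in it (bulk_copy, changes_since, apply_change) is a hypothetical stand-in, not Facebook’s replication system or a Hadoop API.

```python
# Minimal sketch of the "bulk copy, then catch up" pattern used in
# large live-filesystem migrations. All functions here are
# hypothetical stubs for illustration.

import time

def bulk_copy(src: str, dst: str) -> float:
    """One-time mass copy of the existing data.
    Returns the timestamp at which the copy started."""
    started = time.time()
    print(f"bulk copying {src} -> {dst}")
    return started

def changes_since(src: str, since: float) -> list[str]:
    """List files created or modified on src after `since`.
    Stubbed out; a real system would track namespace changes."""
    return []

def apply_change(path: str, dst: str) -> None:
    """Re-copy a single changed file to the destination cluster."""
    print(f"replicating {path} -> {dst}")

def migrate(src: str, dst: str) -> None:
    # Phase 1: copy everything that already exists.
    checkpoint = bulk_copy(src, dst)
    # Phase 2: repeatedly replay whatever changed while copying,
    # until the backlog is empty and the clusters can be switched.
    while True:
        pending = changes_since(src, checkpoint)
        if not pending:
            break  # caught up: safe to cut traffic over to dst
        checkpoint = time.time()
        for path in pending:
            apply_change(path, dst)

migrate("hdfs://old-dc/warehouse", "hdfs://new-dc/warehouse")
```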
Running what they believe is the world’s largest Hadoop-based collection of data, Facebook engineers have developed a way to circumvent a core weakness of the data analysis platform, that of ...