What is Big Data?
Big data is one of the buzzwords of today's world and has gained the attention of many industries. Looking at the name, one might think of it simply as data that is "big" in size, but there is much more to it than that. Data is now generated at an enormous rate, and big data spans three dimensions: Volume, Velocity, and Variety.
Volume – Big data comes in one size: large. Enterprises are awash with data, easily amassing terabytes and even petabytes of information.
Velocity – Often time-sensitive, big data must be used as it streams into the enterprise in order to maximize its value to the business.
Variety – Big data extends beyond structured data to include unstructured data of all varieties: text, audio, video, click streams, log files, and more.
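As a toy illustration (not from the original article), the three dimensions can be sketched in a few lines of Python: records of mixed types (variety) are handled one at a time as they arrive (velocity), so memory use stays constant no matter how large the stream grows (volume). The record format and field names below are invented for the example.

```python
import json

def process_stream(lines):
    """Classify heterogeneous records one at a time.

    Each line is handled as it arrives (velocity) without buffering
    the whole input (volume), and the code branches on record type
    (variety). Returns a count of records per type.
    """
    counts = {}
    for line in lines:
        try:
            record = json.loads(line)        # structured JSON record
            kind = record.get("type", "unknown")
        except ValueError:
            kind = "unstructured_text"       # e.g. a free-form log line
        counts[kind] = counts.get(kind, 0) + 1
    return counts

# A small mixed stream (invented sample data): two clickstream
# events in JSON plus one raw log line.
stream = [
    '{"type": "clickstream", "url": "/home"}',
    'ERROR 2014-01-01 disk full',
    '{"type": "clickstream", "url": "/cart"}',
]
print(process_stream(stream))
```

Because the loop never keeps more than one record in memory, the same sketch works whether the stream holds three lines or three billion; that constant-memory property is exactly why streaming designs matter at big-data volumes.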
The most interesting challenges, as well as the biggest opportunities, fall to companies that deal with big data. You cannot rely on conventional methods to store, load, and analyse it. If a company does not carefully model its big data problems along all three of these dimensions, the friction it faces as it grows will be significant, and there is no doubt that a huge reinvestment will be needed to redesign the whole system to meet its requirements.
Here are some articles that discuss these issues in big data: