Implementing MySQL and Hadoop for Big Data
Many organizations today depend on MySQL for their websites and a big data solution (e.g., Hadoop) for data archiving, storage and analysis (data science). However, integrating MySQL and a big data solution can be challenging. Join Percona Principal Consultant Alexander Rubin as he discusses how to implement a successful big data strategy with Apache Hadoop and MySQL.
Topics to be covered include:
- Introduction to Apache Hadoop and its components, including HDFS, MapReduce, Hive, HBase/HCatalog, Flume, and Sqoop
- How to integrate Hadoop and MySQL using Sqoop and MySQL Applier for Hadoop
- Statistical analysis of clickstream logs as an example of a big data implementation
- Full-text search and near-real-time reporting with Hadoop and MySQL
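For the MySQL-to-Hadoop integration mentioned above, Sqoop is driven from the command line. A minimal sketch of a round trip follows; the host, database, table, and directory names are placeholders for illustration, not details from the webinar:

```shell
# Import a MySQL table into HDFS (placeholder connection and table names)
sqoop import \
  --connect jdbc:mysql://db.example.com/webstats \
  --username analyst -P \
  --table page_views \
  --target-dir /user/hive/warehouse/page_views

# Export aggregated results from HDFS back into MySQL
sqoop export \
  --connect jdbc:mysql://db.example.com/webstats \
  --username analyst -P \
  --table daily_report \
  --export-dir /user/hive/warehouse/daily_report
```

The `-P` flag prompts for the password interactively rather than exposing it on the command line.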
At the end of this webinar, you will understand how to install and configure a Hadoop cluster, integrate MySQL and Hadoop, and implement an enhanced big data solution as a result.
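The clickstream analysis mentioned in the topics list is a classic MapReduce workload. As a conceptual illustration only (not code from the webinar), the sketch below simulates the map and reduce phases locally in Python, counting hits per URL over toy clickstream records:

```python
from collections import defaultdict

# Toy clickstream records (hypothetical): (user_id, url) per page view
clickstream = [
    ("u1", "/home"),
    ("u2", "/products"),
    ("u1", "/products"),
    ("u3", "/home"),
]

def map_phase(records):
    """Map: emit a (url, 1) pair for every click."""
    for _, url in records:
        yield (url, 1)

def reduce_phase(pairs):
    """Shuffle + reduce: sum the emitted counts per url."""
    counts = defaultdict(int)
    for url, n in pairs:
        counts[url] += n
    return dict(counts)

hits = reduce_phase(map_phase(clickstream))
print(hits)  # {'/home': 2, '/products': 2}
```

On a real Hadoop cluster the map and reduce functions run in parallel across HDFS blocks, but the programming model is the same two-phase shape shown here.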
About the Author
Alexander Rubin joined Percona in 2013. He has worked with MySQL since 2000 as a DBA and application developer. Before joining Percona, he spent over seven years as a principal MySQL consultant (starting with MySQL AB in 2006, then Sun Microsystems, then Oracle), helping many customers design large, scalable, and highly available MySQL systems and optimize MySQL performance. Alexander has also helped customers design big data stores with Apache Hadoop and related technologies.
How Percona Can Help
If you are considering using Apache Hadoop and MySQL as part of your big data solution, Percona Consulting can help you:
- Evaluate if Hadoop is right for you
- Set up the Hadoop infrastructure and get you up and running quickly
- Tune and optimize your Hadoop setup
- Build and deploy code to answer your questions, and define new KPIs
- Integrate MySQL and Hadoop (load data from MySQL to Hadoop and vice versa)
Contact us now for more information.