This section describes how to migrate customized components to Hadoop for the HPE Ezmeral Data Fabric.
Hadoop for the HPE Ezmeral Data Fabric features the complete Hadoop distribution, including components such as Hive. There are a few things to know about migrating Hive, or about migrating custom components that you have patched yourself.
If you have applied your own patches to a component and wish to continue using that customized component with the data-fabric distribution, keep the following in mind: you do not need to compile hdfs:// or maprfs:// into your applications, because the data-fabric filesystem accepts either URI scheme and scheme-less paths resolve to the cluster's default filesystem. This is also true of Hadoop ecosystem components that are not included in the data-fabric Hadoop distribution (such as Cascading). For more information, see Working with filesystem.
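For example, a component that uses the standard Hadoop FileSystem API typically needs no source changes to run against the data-fabric filesystem. The sketch below is illustrative only: the directory /user/alice is a placeholder, and on a configured data-fabric client the same code should work whether the path carries a maprfs:// scheme, an hdfs:// scheme, or no scheme at all.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ListClusterDir {
        public static void main(String[] args) throws Exception {
            // Reads core-site.xml and related settings from the client classpath.
            Configuration conf = new Configuration();

            // Placeholder path; maprfs:///user/alice or an hdfs:// URI would
            // resolve through the same FileSystem API on a data-fabric cluster.
            Path dir = new Path("/user/alice");

            // Returns the FileSystem implementation that matches the path's
            // scheme (or the cluster's default filesystem if no scheme is given).
            FileSystem fs = dir.getFileSystem(conf);
            for (FileStatus status : fs.listStatus(dir)) {
                System.out.println(status.getPath());
            }
        }
    }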