Apache HDFS connection
Last updated: Oct 09, 2024

To access your data in Apache HDFS, create a connection asset for it.

Apache Hadoop Distributed File System (HDFS) is a distributed file system that is designed to run on commodity hardware. Apache HDFS was formerly Hortonworks HDFS.

Create a connection to Apache HDFS

To create the connection asset, you need these connection details. The WebHDFS URL is required.
The properties that are available in the connection form depend on whether you select Connect to Apache Hive, which enables you to write tables to the Hive data source.

  • WebHDFS URL to access HDFS.
  • Hive host: Hostname or IP address of the Apache Hive server.
  • Hive database: The database in Apache Hive.
  • Hive port number: The port number of the Apache Hive server. The default value is 10000.
  • Hive HTTP path: The path of the endpoint, such as gateway/default/hive, when the server is configured for HTTP transport mode.
  • SSL certificate (if required by the Apache Hive server).

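Before you create the connection asset, you can sanity-check the WebHDFS URL outside the product. The sketch below builds a WebHDFS REST URL for a LISTSTATUS call; the host name, port, and path are placeholder values (9870 is the default NameNode HTTP port in Hadoop 3; older clusters often use 50070).

```python
def webhdfs_url(base, path, op):
    """Build a WebHDFS REST URL of the form
    <base>/webhdfs/v1/<path>?op=<operation>."""
    return f"{base.rstrip('/')}/webhdfs/v1/{path.lstrip('/')}?op={op}"

# Placeholder connection details; substitute your own WebHDFS URL and path.
url = webhdfs_url("http://namenode.example.com:9870", "/user/demo", "LISTSTATUS")
# On a reachable cluster, an HTTP GET to this URL (for example with
# urllib.request.urlopen) returns a JSON FileStatuses listing.
```

If the GET request succeeds and returns JSON, the WebHDFS URL you enter in the connection form is valid.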
Apache HDFS setup

Install and set up a Hadoop cluster
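The connection relies on the WebHDFS REST API being enabled on the cluster. A minimal hdfs-site.xml sketch of the relevant property (true is the default in recent Hadoop releases, so this is usually only needed if it was explicitly disabled):

```xml
<!-- hdfs-site.xml: enable the WebHDFS REST API on the NameNode -->
<property>
  <name>dfs.webhdfs.enabled</name>
  <value>true</value>
</property>
```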

Supported file types

The Apache HDFS connection supports these file types: Avro, CSV, Delimited text, Excel, JSON, ORC, Parquet, SAS, SAV, SHP, and XML.

Table formats

In addition to Flat file, the Apache HDFS connection supports these Data Lake table formats: Delta Lake and Iceberg.

Learn more

Apache HDFS Users Guide

Parent topic: Supported connections
