After some research, it appears that HDFS uses
its own TCP stack and does not use Apache HttpClient.
So, it cannot be used directly with netCDF-Java.
I think the proper approach is to use our Amazon S3 code
as a template to build HDFS access. If you are interested
in doing this, we would welcome the effort.
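
In the meantime, one possible workaround is to bypass httpservices
entirely: read the file bytes out of HDFS with the Hadoop FileSystem API
and hand them to NetcdfFile.openInMemory. An untested sketch (the class
and method names are mine, and it assumes the whole file fits in memory):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import ucar.nc2.NetcdfFile;
import ucar.nc2.dataset.NetcdfDataset;
import ucar.nc2.dt.grid.GridDataset;

public class HdfsNetcdf { // hypothetical helper class
  public static GridDataset openGrid(String hdfsLocation) throws Exception {
    Configuration conf = new Configuration(); // picks up core-site.xml/hdfs-site.xml
    FileSystem fs = FileSystem.get(URI.create(hdfsLocation), conf);
    Path path = new Path(hdfsLocation);
    long len = fs.getFileStatus(path).getLen();
    byte[] bytes = new byte[(int) len];   // whole file must fit on the heap
    try (FSDataInputStream in = fs.open(path)) {
      in.readFully(0, bytes);             // positioned full read from HDFS
    }
    // Open the in-memory bytes; no hdfs:// URL ever reaches the HTTP layer.
    NetcdfFile ncfile = NetcdfFile.openInMemory(hdfsLocation, bytes);
    NetcdfDataset ds = new NetcdfDataset(ncfile, true); // enhance: build coord systems
    return new GridDataset(ds);
  }
}

Not efficient for large files, but it sidesteps the unsupported-scheme
problem until real HDFS access exists.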
=Dennis Heimbigner
Unidata
On 7/5/2016 10:51 AM, Sean Arms wrote:
> Greetings Shashi!
>
> netCDF-Java uses a wrapper around the Apache HTTP client library called
> httpservices. While httpservices knows about the Apache HTTP client, it
> does not know about the Hadoop HTTP client, which is the cause of the
> error you are seeing.
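>
> That UnsupportedSchemeException comes straight out of the Apache client:
> its connection manager resolves each URL scheme against a registry of
> socket factories, and only http/https are registered. Roughly like this
> (the stock HttpClient setup, not the actual httpservices code):
>
> import org.apache.http.config.Registry;
> import org.apache.http.config.RegistryBuilder;
> import org.apache.http.conn.socket.ConnectionSocketFactory;
> import org.apache.http.conn.socket.PlainConnectionSocketFactory;
> import org.apache.http.conn.ssl.SSLConnectionSocketFactory;
> import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;
>
> public class SchemeRegistryDemo { // illustrative only
>   public static void main(String[] args) {
>     // Only "http" and "https" get socket factories; a request whose URL
>     // scheme is missing from the registry (e.g. "hdfs") fails with
>     // UnsupportedSchemeException when the route is established.
>     Registry<ConnectionSocketFactory> registry =
>         RegistryBuilder.<ConnectionSocketFactory>create()
>             .register("http", PlainConnectionSocketFactory.getSocketFactory())
>             .register("https", SSLConnectionSocketFactory.getSocketFactory())
>             .build();
>     PoolingHttpClientConnectionManager cm =
>         new PoolingHttpClientConnectionManager(registry);
>     System.out.println("registered: http, https; hdfs is not");
>   }
> }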
>
> Dennis - do you know what it would take to support HTTP access to
> Hadoop (HDFS) via httpservices?
>
> Sean
>
>
> On Sat, Jul 2, 2016 at 1:37 AM, shashi kumar <shashi.fengshui@xxxxxxxxx>
> wrote:
>
>> Dear Sean,
>>
>> I am getting the error below when I use the netcdf-all jar in my Hadoop
>> project for analysing a .nc file. Kindly help me identify the issue.
>>
>> ucar.httpservices.HTTPException: ucar.httpservices.HTTPException:
>> org.apache.http.conn.UnsupportedSchemeException: hdfs protocol is not supported
>>     at ucar.httpservices.HTTPMethod.execute(HTTPMethod.java:335)
>>     at ucar.nc2.dataset.NetcdfDataset.checkIfDods(NetcdfDataset.java:864)
>>     at ucar.nc2.dataset.NetcdfDataset.disambiguateHttp(NetcdfDataset.java:820)
>>     at ucar.nc2.dataset.NetcdfDataset.openOrAcquireFile(NetcdfDataset.java:706)
>>     at ucar.nc2.dataset.NetcdfDataset.openDataset(NetcdfDataset.java:427)
>>     at ucar.nc2.dataset.NetcdfDataset.acquireDataset(NetcdfDataset.java:528)
>>     at ucar.nc2.dt.grid.GridDataset.open(GridDataset.java:117)
>>     at ucar.nc2.dt.grid.GridDataset.open(GridDataset.java:103)
>>     at tvl.bd.climate.recordreader.MyRecordReader.initialize(MyRecordReader.java:46)
>>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:521)
>>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:422)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>>     at org.apache.hadoop.mapred.Child.main(Child.java:249)
>> Caused by: ucar.httpservices.HTTPException:
>> org.apache.http.conn.UnsupportedSchemeException: hdfs protocol is not supported
>>     at ucar.httpservices.HTTPSession.execute(HTTPSession.java:1136)
>>     at ucar.httpservices.HTTPMethod.execute(HTTPMethod.java:326)
>>     ... 16 more
>> Caused by: org.apache.http.conn.UnsupportedSchemeException: hdfs protocol is not supported
>>     at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:108)
>>     at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)
>>     at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
>>     at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
>>     at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
>>     at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
>>     at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
>>     at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)
>>     at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:71)
>>     at ucar.httpservices.HTTPSession.execute(HTTPSession.java:1134)
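>>
>> As the trace shows, the call at MyRecordReader.java:46 hands the
>> hdfs:// split location straight to GridDataset.open, essentially
>> (the path shown here is a placeholder, not the real one):
>>
>> // GridDataset.open treats the location as a URL; the hdfs scheme then
>> // fails inside httpservices as shown in the trace above.
>> GridDataset gds =
>>     ucar.nc2.dt.grid.GridDataset.open("hdfs://namenode/user/data/sample.nc");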
>>
>>
>> Regards,
>>
>> Sashi
>>
>>
>