<%--
    Document   : index
    Created on : Sep 17, 2016, 6:42:16 AM
    Author     : jbf
--%>
<%@page import="org.json.JSONException"%>
<%@page import="java.util.Enumeration"%>
<%@page import="java.io.File"%>
<%@page import="org.das2.datum.DatumRange"%>
<%@page import="org.json.JSONObject"%>
<%@page import="org.json.JSONArray"%>
<%@page import="org.autoplot.hapiserver.HapiServerSupport"%>
<%@page import="org.autoplot.hapiserver.Util"%>
<%@page contentType="text/html" pageEncoding="UTF-8"%>
HAPI Server JSP Demo

This is a HAPI Server.

More information about this type of server can be found at GitHub. This implementation of the HAPI server uses Autoplot URIs to load data; more information about Autoplot can be found here.
Run the HAPI server verifier.
<%
    Util.maybeInitialize( getServletContext() );
    if ( Util.getHapiHome()==null ) {
        String HAPI_SERVER_HOME= getServletContext().getInitParameter("HAPI_SERVER_HOME");
        Util.setHapiHome( new File( HAPI_SERVER_HOME ) );
    }
    // identify the client, looking through a proxy when one forwards the request.
    String ip = request.getRemoteAddr();
    if (ip.equals("127.0.0.1")) {
        Enumeration hh= request.getHeaders("X-Forwarded-For");
        if ( hh.hasMoreElements() ) {
            ip = (String)hh.nextElement();
        }
    }
    if ( ip.equals("127.0.0.1") || ip.equals("0:0:0:0:0:0:0:1") ) {
        String s= request.getRequestURI();
        int i= s.indexOf("/",1);
        s= s.substring(0,i);  // the context path of this deployment
        out.println( String.format( "<p>This is run from localhost, set logging with <a href=\"%s/SetLogLevel\">SetLogLevel</a>.</p>", s ));
        out.println( "<p>Requests from localhost will have performance monitored, which can degrade performance.</p>" );
    }
%>

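If other pages need the same client-address logic, the check above could be factored into a small helper. The sketch below is a hypothetical utility, not part of this server; it simply mirrors the localhost/X-Forwarded-For handling in the scriptlet above.

import javax.servlet.http.HttpServletRequest;

/** Hypothetical helper mirroring the localhost/X-Forwarded-For check above. */
public final class ClientAddress {
    private ClientAddress() {}

    /**
     * Return the requesting IP address, preferring the X-Forwarded-For header
     * when the request arrives through a local reverse proxy.
     */
    public static String resolve( HttpServletRequest request ) {
        String ip = request.getRemoteAddr();
        if ( ip.equals("127.0.0.1") || ip.equals("0:0:0:0:0:0:0:1") ) {
            String forwarded = request.getHeader("X-Forwarded-For");
            if ( forwarded != null && !forwarded.trim().isEmpty() ) {
                // the header may carry a comma-separated chain; the first entry is the original client.
                ip = forwarded.split(",")[0].trim();
            }
        }
        return ip;
    }
}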
Some example requests:

  • Catalog: show the catalog of available data sets.
  • Capabilities: the capabilities of the server. For example, can it use binary streams to transfer data? (A client-side sketch of these requests follows below.)

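For instance, a client might fetch those two responses with a few lines of Java. The sketch below is only illustrative: the base URL is a placeholder for wherever this server is deployed, and it relies only on the standard HAPI response keys ("catalog", "outputFormats") plus the org.json classes this page already imports.

import java.io.InputStream;
import java.net.URL;
import org.json.JSONArray;
import org.json.JSONObject;

/** Illustrative HAPI client listing the catalog and capabilities; the base URL is a placeholder. */
public class CatalogExample {
    public static void main( String[] args ) throws Exception {
        String base = "https://example.org/HapiServerDemo/hapi";   // hypothetical deployment location

        JSONObject catalog = readJson( base + "/catalog" );
        JSONArray datasets = catalog.getJSONArray("catalog");       // standard HAPI catalog key
        for ( int i=0; i<datasets.length(); i++ ) {
            System.out.println( datasets.getJSONObject(i).getString("id") );
        }

        JSONObject capabilities = readJson( base + "/capabilities" );
        System.out.println( "outputFormats: " + capabilities.getJSONArray("outputFormats") );
    }

    private static JSONObject readJson( String url ) throws Exception {
        try ( InputStream in = new URL(url).openStream() ) {
            return new JSONObject( new String( in.readAllBytes(), "UTF-8" ) );
        }
    }
}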
<%
    try {
        String HAPI_SERVER_HOME= getServletContext().getInitParameter("HAPI_SERVER_HOME");
        Util.setHapiHome( new File( HAPI_SERVER_HOME ) );
        JSONArray dss= HapiServerSupport.getCatalog();
        for ( int i=0; i<dss.length(); i++ ) {
            JSONObject ds= dss.getJSONObject(i);
            String id= ds.getString("id");
            String title= ds.has("title") ? ds.getString("title") : id;        // assumes the optional HAPI catalog "title"
            JSONObject info= HapiServerSupport.getInfo( id );                   // assumed helper returning the dataset's info response
            DatumRange exampleRange= HapiServerSupport.getExampleRange( info ); // assumed helper reading the sample time range
            out.println( String.format("<h3>%s</h3>", title ) );
            if ( exampleRange!=null ) {
                out.println( String.format("<a href=\"hapi/info?id=%s\">Info</a> <a href=\"hapi/data?id=%s&amp;time.min=%s&amp;time.max=%s\">Data</a>",
                        ds.getString("id"), ds.getString("id"), exampleRange.min().toString(), exampleRange.max().toString() ) );
            } else {
                out.println( String.format("<a href=\"hapi/info?id=%s\">Info</a> <a href=\"hapi/data?id=%s\">Data</a>",
                        ds.getString("id"), ds.getString("id") ) );
            }
            out.println(" ");
            JSONArray parameters= info.getJSONArray("parameters");
            for ( int j=0; j<parameters.length(); j++ ) {
                if ( j>0 ) out.print(", ");
                try {
                    out.print( parameters.getJSONObject(j).getString("name") );
                } catch ( JSONException ex ) {
                    out.print( "???" );
                }
            }
        }
    } catch ( JSONException ex ) {
        out.print("<p>Something has gone wrong, see logs or send an email to faden at cottagesystems.com</p>");
        out.println("<pre>"+ex.toString());
    }
%>

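The Info and Data links generated above follow the standard HAPI request pattern, so a data request can also be issued directly. The sketch below is a client-side illustration only; the base URL, dataset id, and time range are placeholders.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLEncoder;

/** Illustrative HAPI data request; base URL, dataset id, and time range are placeholders. */
public class DataRequestExample {
    public static void main( String[] args ) throws Exception {
        String base = "https://example.org/HapiServerDemo/hapi";        // hypothetical deployment location
        String id   = "someDataset";                                     // hypothetical dataset id
        String url  = base + "/data?id=" + URLEncoder.encode( id, "UTF-8" )
                    + "&time.min=2016-10-01T00:00Z&time.max=2016-10-02T00:00Z"
                    + "&include=header";                                 // prepend the info header as "#" comments
        try ( BufferedReader reader = new BufferedReader(
                new InputStreamReader( new URL(url).openStream(), "UTF-8" ) ) ) {
            String line;
            while ( (line = reader.readLine()) != null ) {
                System.out.println( line );                              // CSV records, one per line
            }
        }
    }
}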
<% long l= org.das2.qds.RecordIterator.TIME_STAMP; // load RecordIterator class first, or we'll get a negative time. %>
deployed <%= Util.getDurationForHumans( System.currentTimeMillis() - l ) %> ago
  • 2016-09-21: bugfix: parameters supported when include is not set.
  • 2016-09-26: more parameters in current conditions.
  • 2016-09-29: time ranges.
  • 2016-09-30: add spectrogram example.
  • 2016-10-03: correctly handle /hapi request, which redirects to /index.jsp.
  • 2016-10-04: add sample time range.
  • 2016-10-05: add power meter image spectrograms.
  • 2016-10-09: digits spectrogram is 27 channel spectrogram.
  • 2016-10-11: put in HAPI extension longDescription.
  • 2016-10-13: finish off support for streaming.
  • 2016-10-14: add a noStream version of one of the datasets, so that Autoplot can be used to compare.
  • 2016-10-16: add capabilities page. Properly handle empty datasets from readers.
  • 2016-10-25: add support for binary transfer.
  • 2016-10-29: bugfix: binary assumed that times were in us2000.
  • 2016-10-31: bugfix: running app under different user showed that pylisting.txt was not available.
  • 2016-11-10: add titles to each item.
  • 2016-11-11: bugfix: properly handle no granules of data found. Add forecast, which includes non-monotonic data.
  • 2016-11-15: add rain to forecast.
  • 2016-11-21: new capabilities scheme.
  • 2016-11-23: corrections to bugs found by Scott, 1717, like parameters=Spectra.
  • 2017-01-04: add support for DOI and SPASE references in extra info.
  • 2017-01-08: add wind speed.
  • 2017-01-10: use startDate and stopDate instead of firstDate and lastDate as decided on 2017-01-10 telecon.
  • 2017-02-03: add http link in info, to show support for this.
  • 2017-02-03: add demonstration dataset for rank 3 data.
  • 2017-02-07: use bins array instead of bins1, bins2.
  • 2017-02-13: use x_about instead of about.
  • 2017-02-21: work towards making the server externally configurable.
  • 2017-02-28: tweak the connection time for CDAWeb web services, add setLogLevel servlet.
  • 2017-03-04: use web.xml to set the initial location of the servlet data.
  • 2017-03-05: recent changes to support time-varying DEPEND_1 broke old codes and there was not sufficient testing to catch the mistake.
  • 2017-03-06: fix silly mistakes in untested changes. More silly mistakes.
  • 2017-03-15: allow data to come from csv files in data directory.
  • 2017-03-22: add experimental upload data capability.
  • 2017-03-28: finally support nominal data.
  • 2017-03-29: bugfix with include=header when cached file is used.
  • 2017-05-08: bugfix with binary, where the size used internally is now an array of ints, which was probably not expected.
  • 2017-05-24: copy x_meta and resourceURI from templates.
  • 2017-06-07: support now-P1D/now for example time range.
  • 2017-06-08: catalog and capabilities responses can be cached.
  • 2017-06-09: bug in info response, where time was hard-coded and not getting changes in static file.
  • 2017-06-19: Bob's verifier caught that time lengths assumed the string need not be null-terminated. Doubles are now used for return types.
  • 2017-06-20: Bob's verifier caught that streaming data sources were not trimmed to request time.
  • 2017-06-21: support for P1D/lastday added to DasCoreDatum, so that sample times are not always changing.
  • 2017-06-28: return 404 when ID is bad, instead of empty response. Bugfix, where streaming datasources would output an extra record. Bugfix, subset parameters in info request. Thanks, Bob!
  • 2017-08-14: add experimental caching mechanism, where HOME/hapi/cache can contain daily cache files. Cache is stored in .gzip form.
  • 2017-08-23: the failed release was using an old version, where format=binary would return ascii files from the cache, and ascii would not properly subset.
  • 2017-11-06: put in new catch-all code on the landing page, to aid in debugging.
  • 2017-12-01: allow modification date to be "lastday", meaning the dataset was updated at midnight.
  • 2017-12-02: various improvements to logging, and new class for monitoring output stream idle added. More improvements.
  • 2017-12-03: correct check for localhost.
  • 2018-02-13: add message to error message for info response. Update version declarations to HAPI 2.0.
  • 2018-10-23: correct invalid id response, thanks to Bob's verifier.
  • 2019-02-26: correction to where two spectrograms could be served (specBins.2).
  • 2019-02-26: corrections prompted by HapiVerifier, such as x-deployedAt should be x_deployedAt.
  • 2019-03-05: make csv output more readable by checking for FORMAT property and using standard formatters.
  • 2019-03-27: correct bug with file handling when cache files are used and multiple days are requested.
  • 2019-04-24: when cache files are found, check dates to see if If-Modified-Since can be used to implement request.
  • 2019-09-30: correction for where time-varying channels are used.
  • 2020-01-21: update to HAPI 2.1. Bugfix when just "Time" was requested.
  • 2020-03-02: javacsv library needed.