Simple support for remote web stores #7325

Merged
108 commits
c3db55b
Initial implementation
qqmyers Oct 13, 2020
e8c1578
null check on dateString
qqmyers Oct 13, 2020
00d53ee
adjust incoming identifier for HttpOverlay drivers
qqmyers Oct 14, 2020
94921bd
support overlay case
qqmyers Oct 14, 2020
cbdd35c
document need to update for overlay case
qqmyers Oct 14, 2020
11535bd
keep owner for getStorageIO call for HttpOverlay case
qqmyers Oct 14, 2020
1800575
typos
qqmyers Oct 14, 2020
239d5a8
debug logging
qqmyers Oct 14, 2020
e86c2d0
more logging
qqmyers Oct 14, 2020
0062c68
fix storageidentifier parsing/updating
qqmyers Oct 14, 2020
d6a5f65
more info about errors handled by ThrowableHandler
qqmyers Oct 14, 2020
d821b62
fine debug to show size
qqmyers Oct 14, 2020
1a8f0f1
actually instantiate an HttpClient !
qqmyers Oct 14, 2020
ad86e4c
algorithm fixes and logging
qqmyers Oct 14, 2020
4a9f209
log exception
qqmyers Oct 15, 2020
b339583
support auxPath for direct/overlay case
qqmyers Oct 15, 2020
5131e5e
create dir when needed for aux
qqmyers Oct 15, 2020
afa37ef
S3 flag to distinguish overlap and direct-upload cases
qqmyers Oct 15, 2020
6aaabe2
fix s3 storagelocation
qqmyers Oct 15, 2020
bd37c2e
Revert "fix s3 storagelocation"
qqmyers Oct 15, 2020
14a1196
fine logging
qqmyers Oct 15, 2020
e47eed7
fix storagelocation issues
qqmyers Oct 15, 2020
8497b2b
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Nov 3, 2020
5c8cb1a
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Nov 13, 2020
b253ab2
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Dec 10, 2020
140ffaa
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Jan 8, 2021
9b14433
Merge remote-tracking branch 'IQSS/develop' into
qqmyers Feb 23, 2021
e72c4e5
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Apr 7, 2021
0ea4cf9
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Apr 13, 2021
6fa5e90
Merge remote-tracking branch 'IQSS/develop' into
qqmyers May 20, 2021
257349a
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Aug 4, 2021
41dedcb
format/cleanup
qqmyers Aug 5, 2021
7881a70
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Sep 3, 2021
e7ddf86
fix for get dataset logo with overlay store
qqmyers Sep 7, 2021
6b9cdef
update to check store type
qqmyers Sep 7, 2021
60d7d0d
refactor to support support addFiles api from #7901
qqmyers Sep 7, 2021
da133ec
refactor UI code
qqmyers Sep 7, 2021
c719a88
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Oct 13, 2021
76bfee2
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Jan 25, 2022
bbc7e32
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Feb 2, 2022
cc763f8
Merge remote-tracking branch 'IQSS/develop' into
qqmyers Mar 21, 2022
6dded83
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Mar 30, 2022
ce6bafe
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Apr 6, 2022
5a823ab
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Apr 14, 2022
c5246d2
Merge remote-tracking branch 'IQSS/develop' into
qqmyers Apr 29, 2022
7b68d57
Refactor to RemoteOverlay, use constants for store types/sep
qqmyers Apr 29, 2022
bebc275
refactor strings to RemoteOverlay
qqmyers Apr 29, 2022
edc9152
add basic support for remote tag/label in file table
qqmyers Apr 29, 2022
648ee1c
start doc changes
qqmyers Apr 29, 2022
570e97a
documentation, tweak to new branding property names
qqmyers Apr 29, 2022
7e590ad
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Apr 29, 2022
62b5488
typo
qqmyers Apr 29, 2022
e62a163
fix tabs in preexisting code
qqmyers Apr 29, 2022
3d3aab6
typos
qqmyers May 10, 2022
6bd92d6
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Jun 8, 2022
9133de7
cut/paste logic error re: remote tag
qqmyers Jun 8, 2022
1080031
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Jun 15, 2022
e8c3ed3
force lowercase for hash values - that's what is generated internally
qqmyers Jul 5, 2022
1bad2f3
log mismatched checksum values
qqmyers Jul 5, 2022
4441795
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Jul 25, 2022
37e2581
refactor for download redirect in remoteoverlaystore
qqmyers Jul 8, 2022
a401048
refactor to allow URL token substitution outside tools framework
qqmyers Jun 21, 2022
e23fb30
support passthrough for uploading files
qqmyers Jul 26, 2022
bcad012
Merge remote-tracking branch 'IQSS/develop' into
qqmyers Aug 2, 2022
c3db1ba
doc typo
qqmyers Aug 3, 2022
751a829
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Aug 3, 2022
846d866
Apply suggestions from code review
qqmyers Aug 4, 2022
8c6b31a
switch to hyphens per review
qqmyers Aug 4, 2022
984254a
reduce variations on trusted remote store
qqmyers Aug 4, 2022
c3bbfec
add signer tests, flip param order so sign/validate match, fix val bug
qqmyers Aug 4, 2022
56f7676
update secret-key, cleanup
qqmyers Aug 5, 2022
1e4a724
Add tests/add support for local file base store tests
qqmyers Aug 5, 2022
5705e67
add an API test for local dev/testing #7324
pdurbin Aug 5, 2022
0902975
sign even for internal access
qqmyers Aug 5, 2022
ab90c16
Merge branch 'IQSS/7324_TRSA-HTTP-store' of https://github.com/Global…
qqmyers Aug 5, 2022
7e9d066
add some validation and test
qqmyers Aug 5, 2022
db4192e
typo in method name
qqmyers Aug 5, 2022
0b424f6
Merge branch 'develop' into IQSS/7324_TRSA-HTTP-store #7324
pdurbin Aug 8, 2022
c4eee7c
add curl example #7324
pdurbin Aug 8, 2022
c688e99
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Aug 8, 2022
e40ea11
Merge branch 'IQSS/7324_TRSA-HTTP-store' of https://github.com/Global…
qqmyers Aug 8, 2022
0ce597a
Error handling or default on required params
qqmyers Aug 8, 2022
800eca2
sanity check to make sure driver being specified in addFile exists
qqmyers Aug 8, 2022
f730afa
only get value from json once
qqmyers Aug 8, 2022
1583788
update RemoteStoreIT test to show JVM options used #7324
pdurbin Aug 9, 2022
25b4059
add separate downloadRedirectEnabled for aux objects method
qqmyers Aug 9, 2022
504ca17
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Aug 9, 2022
e6fb485
add logic to check base store download redirect for aux objects
qqmyers Aug 9, 2022
0fd56cf
minor error meg and comment changes
qqmyers Aug 9, 2022
085770a
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Aug 10, 2022
361018f
remove cruft from tests #7324
pdurbin Aug 10, 2022
cee4f9d
Added a note about limitations of what's in the PR.
qqmyers Aug 10, 2022
c4f6fa5
Merge branch 'IQSS/7324_TRSA-HTTP-store' of https://github.com/Global…
qqmyers Aug 10, 2022
909b9c7
use single file API call /add
qqmyers Aug 16, 2022
5f633e4
copy non-globus parts from #8891 per review request
qqmyers Aug 16, 2022
cb1755d
add missing label
qqmyers Aug 16, 2022
7f990dc
Merge remote-tracking branch 'IQSS/develop' into IQSS/7324_TRSA-HTTP-…
qqmyers Aug 16, 2022
3d9418e
Handle null file size per QA discussion
qqmyers Aug 16, 2022
643b924
add checking w.r.t. dataset storage driver/base driver
qqmyers Aug 17, 2022
7301c62
add remote store in direct access to support sending file delete call
qqmyers Aug 17, 2022
0da52fc
typo
qqmyers Aug 17, 2022
45aa976
fix for delete
qqmyers Aug 17, 2022
94ffcbf
update to docs per QA
qqmyers Aug 17, 2022
708637d
keep remote and base identifiers in getStorageLocation, fix base config
qqmyers Aug 17, 2022
37bba52
add direct link to s3 call
qqmyers Aug 17, 2022
38856ef
fix base store config/related test that missed
qqmyers Aug 17, 2022
e72def0
Add test for bad remote URLs
qqmyers Aug 18, 2022
70a8b3b
note re 404 URLs
qqmyers Aug 18, 2022
36 changes: 33 additions & 3 deletions doc/sphinx-guides/source/installation/config.rst
@@ -238,13 +238,15 @@ As for the "Remote only" authentication mode, it means that:
- ``:DefaultAuthProvider`` has been set to use the desired authentication provider
- The "builtin" authentication provider has been disabled (:ref:`api-toggle-auth-provider`). Note that disabling the "builtin" authentication provider means that the API endpoint for converting an account from a remote auth provider will not work. Converting directly from one remote authentication provider to another (i.e. from GitHub to Google) is not supported. Conversion from remote is always to "builtin". Then the user initiates a conversion from "builtin" to remote. Note that longer term, the plan is to permit multiple login options to the same Dataverse installation account per https://github.com/IQSS/dataverse/issues/3487 (so all this talk of conversion will be moot) but for now users can only use a single login option, as explained in the :doc:`/user/account` section of the User Guide. In short, "remote only" might work for you if you only plan to use a single remote authentication provider such that no conversion between remote authentication providers will be necessary.

File Storage: Using a Local Filesystem and/or Swift and/or object stores
------------------------------------------------------------------------
File Storage: Using a Local Filesystem and/or Swift and/or object stores and/or trusted remote services
-------------------------------------------------------------------------------------------------------

By default, a Dataverse installation stores all data files (files uploaded by end users) on the filesystem at ``/usr/local/payara5/glassfish/domains/domain1/files``. This path can vary based on answers you gave to the installer (see the :ref:`dataverse-installer` section of the Installation Guide) or afterward by reconfiguring the ``dataverse.files.\<id\>.directory`` JVM option described below.

A Dataverse installation can alternately store files in a Swift or S3-compatible object store, and can now be configured to support multiple stores at once. With a multi-store configuration, the location for new files can be controlled on a per-Dataverse collection basis.

Dataverse may also be configured to reference some files (e.g. large and/or sensitive data) stored in a trusted remote web-accessible system.

The following sections describe how to set up various types of stores and how to configure for multiple stores.

Multi-store Basics
@@ -663,6 +665,34 @@ Migrating from Local Storage to S3

Is currently documented on the :doc:`/developers/deployment` page.

Trusted Remote Storage
++++++++++++++++++++++

In addition to having the type "remote" and requiring a label, Trusted Remote Stores are defined in terms of a ``baseUrl`` (all files managed by this store must be at a path starting with this URL) and a ``baseStore`` (a file, S3, or Swift store used to hold additional ancillary dataset files, e.g. metadata exports, thumbnails, and auxiliary files).
These and other available options are described in the table below.

Remote stores can range from a static trusted website to a sophisticated service that manages access requests, logs activity,
and/or controls access to a secure enclave. For a specific remote store, consult its documentation when configuring it in Dataverse.

.. table::
:align: left

=========================================== ================== ========================================================================== =============
JVM Option Value Description Default value
=========================================== ================== ========================================================================== =============
dataverse.files.<id>.type ``remote`` **Required** to mark this storage as remote. (none)
dataverse.files.<id>.label <?> **Required** label to be shown in the UI for this storage (none)
dataverse.files.<id>.baseUrl <?> **Required** All files must have URLs of the form <baseUrl>/* (none)
dataverse.files.<id>.baseStore <?> **Required** The id of a base store (of type file, s3, or swift) (none)
dataverse.files.<id>.download-redirect ``true``/``false`` Enable direct download (should usually be true). ``false``
dataverse.files.<id>.secretKey <?> A key used to sign download requests sent to the remote store. Optional. (none)
dataverse.files.<id>.url-expiration-minutes <?> If direct downloads and using signing: time until links expire. Optional. 60
dataverse.files.<id>.remote-store-name <?> A short name used in the UI to indicate where a file is located. Optional (none)
dataverse.files.<id>.remote-store-url <?> A url to an info page about the remote store used in the UI. Optional. (none)
A couple of these descriptions don't end in a period.


=========================================== ================== ========================================================================== =============
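As a rough illustration of the options in the table above, a trusted remote store could be set up with JVM options like the following. This is a sketch only: the store id ``demo``, the base URL, and the key are hypothetical placeholders, not values from this PR, and paths may differ per installation.

```shell
# Sketch: store id "demo", URL, and key are placeholders.
# Colons in values passed to asadmin must be escaped with a backslash.
PAYARA=/usr/local/payara5
$PAYARA/bin/asadmin create-jvm-options "-Ddataverse.files.demo.type=remote"
$PAYARA/bin/asadmin create-jvm-options "-Ddataverse.files.demo.label=DemoRemote"
$PAYARA/bin/asadmin create-jvm-options "-Ddataverse.files.demo.baseUrl=https\://remote.example.edu/data"
$PAYARA/bin/asadmin create-jvm-options "-Ddataverse.files.demo.baseStore=file"
$PAYARA/bin/asadmin create-jvm-options "-Ddataverse.files.demo.download-redirect=true"
$PAYARA/bin/asadmin create-jvm-options "-Ddataverse.files.demo.secretKey=change-me"
```

With ``download-redirect`` enabled and a ``secretKey`` set, download requests sent to the remote store are signed, per the table above.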



.. _Branding Your Installation:

@@ -2568,7 +2598,7 @@ Number of errors to display to the user when creating DataFiles from a file uplo
.. _:BagItHandlerEnabled:

:BagItHandlerEnabled
+++++++++++++++++++++
++++++++++++++++++++

Part of the database settings to configure the BagIt file handler. Enables the BagIt file handler. By default, the handler is disabled.

@@ -15,7 +15,6 @@
public abstract class DvObjectContainer extends DvObject {


//Default to "file" is for tests only
public static final String UNDEFINED_METADATA_LANGUAGE_CODE = "undefined"; //Used in dataverse.xhtml as a non-null selection option value (indicating inheriting the default)


10 changes: 7 additions & 3 deletions src/main/java/edu/harvard/iq/dataverse/EditDatafilesPage.java
@@ -1933,7 +1933,7 @@ private void handleReplaceFileUpload(String fullStorageLocation,

fileReplacePageHelper.resetReplaceFileHelper();
saveEnabled = false;
String storageIdentifier = DataAccess.getStorarageIdFromLocation(fullStorageLocation);
String storageIdentifier = DataAccess.getStorageIdFromLocation(fullStorageLocation);
if (fileReplacePageHelper.handleNativeFileUpload(null, storageIdentifier, fileName, contentType, checkSumValue, checkSumType)) {
saveEnabled = true;

@@ -2078,8 +2078,12 @@ public void handleExternalUpload() {
if (!checksumTypeString.isBlank()) {
checksumType = ChecksumType.fromString(checksumTypeString);
}

//Should only be one colon with current design
int lastColon = fullStorageIdentifier.lastIndexOf(':');
String storageLocation = fullStorageIdentifier.substring(0, lastColon) + "/" + dataset.getAuthorityForFileStorage() + "/" + dataset.getIdentifierForFileStorage() + "/" + fullStorageIdentifier.substring(lastColon + 1);
String storageLocation = fullStorageIdentifier.substring(0,lastColon) + "/" + dataset.getAuthorityForFileStorage() + "/" + dataset.getIdentifierForFileStorage() + "/" + fullStorageIdentifier.substring(lastColon+1);
storageLocation = DataAccess.expandStorageIdentifierIfNeeded(storageLocation);

if (uploadInProgress.isFalse()) {
uploadInProgress.setValue(true);
}
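The single-colon split in handleExternalUpload() above takes an identifier like ``s3://mybucket:18ad21c0`` and inserts the dataset's authority and identifier before the file part. A shell equivalent of that ``lastIndexOf(':')`` logic, with hypothetical authority/identifier values, can be sketched as:

```shell
# Sketch of the split performed above; all values are hypothetical.
full="s3://mybucket:18ad21c0"
authority="10.5072"
identifier="FK2ABCDE"
prefix="${full%:*}"    # everything before the last colon: s3://mybucket
file="${full##*:}"     # everything after the last colon:  18ad21c0
echo "${prefix}/${authority}/${identifier}/${file}"
# -> s3://mybucket/10.5072/FK2ABCDE/18ad21c0
```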
@@ -3044,7 +3048,7 @@ public void saveAdvancedOptions() {
}

public boolean rsyncUploadSupported() {
// ToDo - rsync was written before multiple store support and currently is hardcoded to use the "s3" store.
// ToDo - rsync was written before multiple store support and currently is hardcoded to use the DataAccess.S3 store.
// When those restrictions are lifted/rsync can be configured per store, the test in the
// Dataset Util method should be updated
if (settingsWrapper.isRsyncUpload() && !DatasetUtil.isAppropriateStorageDriver(dataset)) {
@@ -560,12 +560,12 @@ public void addFileToCustomZipJob(String key, DataFile dataFile, Timestamp times

public String getDirectStorageLocatrion(String storageLocation) {
String storageDriverId;
int separatorIndex = storageLocation.indexOf("://");
int separatorIndex = storageLocation.indexOf(DataAccess.SEPARATOR);
if ( separatorIndex > 0 ) {
storageDriverId = storageLocation.substring(0,separatorIndex);

String storageType = DataAccess.getDriverType(storageDriverId);
if ("file".equals(storageType) || "s3".equals(storageType)) {
if (DataAccess.FILE.equals(storageType) || DataAccess.S3.equals(storageType)) {
return storageType.concat(storageLocation.substring(separatorIndex));
}
}
78 changes: 39 additions & 39 deletions src/main/java/edu/harvard/iq/dataverse/api/Datasets.java
@@ -1908,7 +1908,7 @@ public Response receiveChecksumValidationResults(@PathParam("identifier") String
String message = wr.getMessage();
return error(Response.Status.INTERNAL_SERVER_ERROR, "Uploaded files have passed checksum validation but something went wrong while attempting to put the files into Dataverse. Message was '" + message + "'.");
}
} else if(storageDriverType.equals("s3")) {
} else if(storageDriverType.equals(DataAccess.S3)) {

logger.log(Level.INFO, "S3 storage driver used for DCM (dataset id={0})", dataset.getId());
try {
@@ -2371,40 +2371,40 @@ public Response addFileToDataset(@PathParam("id") String idSupplied,
String newFilename = null;
String newFileContentType = null;
String newStorageIdentifier = null;
if (null == contentDispositionHeader) {
Thank you for replacing all these tabs with spaces!

if (optionalFileParams.hasStorageIdentifier()) {
newStorageIdentifier = optionalFileParams.getStorageIdentifier();
// ToDo - check that storageIdentifier is valid
if (optionalFileParams.hasFileName()) {
newFilename = optionalFileParams.getFileName();
if (optionalFileParams.hasMimetype()) {
newFileContentType = optionalFileParams.getMimeType();
}
}
} else {
return error(BAD_REQUEST,
"You must upload a file or provide a storageidentifier, filename, and mimetype.");
}
} else {
newFilename = contentDispositionHeader.getFileName();
// Let's see if the form data part has the mime (content) type specified.
// Note that we don't want to rely on formDataBodyPart.getMediaType() -
// because that defaults to "text/plain" when no "Content-Type:" header is
// present. Instead we'll go through the headers, and see if "Content-Type:"
// is there. If not, we'll default to "application/octet-stream" - the generic
// unknown type. This will prompt the application to run type detection and
// potentially find something more accurate.
//newFileContentType = formDataBodyPart.getMediaType().toString();

for (String header : formDataBodyPart.getHeaders().keySet()) {
if (header.equalsIgnoreCase("Content-Type")) {
newFileContentType = formDataBodyPart.getHeaders().get(header).get(0);
}
}
if (newFileContentType == null) {
newFileContentType = FileUtil.MIME_TYPE_UNDETERMINED_DEFAULT;
}
}
if (null == contentDispositionHeader) {
if (optionalFileParams.hasStorageIdentifier()) {
newStorageIdentifier = optionalFileParams.getStorageIdentifier();
newStorageIdentifier = DataAccess.expandStorageIdentifierIfNeeded(newStorageIdentifier);
if (optionalFileParams.hasFileName()) {
newFilename = optionalFileParams.getFileName();
if (optionalFileParams.hasMimetype()) {
newFileContentType = optionalFileParams.getMimeType();
}
}
} else {
return error(BAD_REQUEST,
"You must upload a file or provide a storageidentifier, filename, and mimetype.");
}
} else {
newFilename = contentDispositionHeader.getFileName();
// Let's see if the form data part has the mime (content) type specified.
// Note that we don't want to rely on formDataBodyPart.getMediaType() -
// because that defaults to "text/plain" when no "Content-Type:" header is
// present. Instead we'll go through the headers, and see if "Content-Type:"
// is there. If not, we'll default to "application/octet-stream" - the generic
// unknown type. This will prompt the application to run type detection and
// potentially find something more accurate.
// newFileContentType = formDataBodyPart.getMediaType().toString();

for (String header : formDataBodyPart.getHeaders().keySet()) {
if (header.equalsIgnoreCase("Content-Type")) {
newFileContentType = formDataBodyPart.getHeaders().get(header).get(0);
}
}
if (newFileContentType == null) {
newFileContentType = FileUtil.MIME_TYPE_UNDETERMINED_DEFAULT;
}
}
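For the no-upload branch above (a storage identifier supplied in jsonData instead of a file part), a request to the add-file endpoint might look like the following sketch. The server URL, API token, persistent ID, and storage identifier are all placeholders, not values from this PR:

```shell
# Sketch: all values below are placeholders.
API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
SERVER_URL=https://demo.dataverse.org
PERSISTENT_ID=doi:10.5072/FK2/EXAMPLE
curl -H "X-Dataverse-key: $API_TOKEN" -X POST \
  "$SERVER_URL/api/datasets/:persistentId/add?persistentId=$PERSISTENT_ID" \
  -F 'jsonData={"storageIdentifier":"demo://file-on-remote-store.txt","fileName":"file-on-remote-store.txt","mimeType":"text/plain"}'
```

Per the code above, omitting both the file part and the storageIdentifier/fileName/mimeType trio returns a BAD_REQUEST error.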


//-------------------
@@ -2912,7 +2912,7 @@ public Response setFileStore(@PathParam("identifier") String dvIdtf,
}
if (!user.isSuperuser()) {
return error(Response.Status.FORBIDDEN, "Superusers only.");
}
}

Dataset dataset;

@@ -2930,7 +2930,7 @@
return ok("Storage driver set to: " + store.getKey() + "/" + store.getValue());
}
}
return error(Response.Status.BAD_REQUEST,
return error(Response.Status.BAD_REQUEST,
"No Storage Driver found for : " + storageDriverLabel);
}

@@ -2948,7 +2948,7 @@ public Response resetFileStore(@PathParam("identifier") String dvIdtf,
}
if (!user.isSuperuser()) {
return error(Response.Status.FORBIDDEN, "Superusers only.");
}
}

Dataset dataset;

@@ -2960,7 +2960,7 @@

dataset.setStorageDriverId(null);
datasetService.merge(dataset);
return ok("Storage reset to default: " + DataAccess.DEFAULT_STORAGE_DRIVER_IDENTIFIER);
return ok("Storage reset to default: " + DataAccess.DEFAULT_STORAGE_DRIVER_IDENTIFIER);
}

@GET
@@ -93,7 +93,7 @@ public void writeTo(DownloadInstance di, Class<?> clazz, Type type, Annotation[]

// Before we do anything else, check if this download can be handled
// by a redirect to remote storage (only supported on S3, as of 5.4):
if (storageIO instanceof S3AccessIO && ((S3AccessIO) storageIO).downloadRedirectEnabled()) {
if (storageIO.downloadRedirectEnabled()) {

// Even if the above is true, there are a few cases where a
// redirect is not applicable.
@@ -188,16 +188,15 @@ public void writeTo(DownloadInstance di, Class<?> clazz, Type type, Annotation[]
// [attempt to] redirect:
String redirect_url_str;
try {
redirect_url_str = ((S3AccessIO) storageIO).generateTemporaryS3Url(auxiliaryTag, auxiliaryType, auxiliaryFileName);
redirect_url_str = storageIO.generateTemporaryDownloadUrl(auxiliaryTag, auxiliaryType, auxiliaryFileName);
} catch (IOException ioex) {
logger.warning("Unable to generate downloadURL for " + dataFile.getId() + ": " + auxiliaryTag);
//Setting null will let us try to get the file/aux file w/o redirecting
redirect_url_str = null;
}

if (redirect_url_str == null) {
throw new ServiceUnavailableException();
}

logger.fine("Data Access API: direct S3 url: " + redirect_url_str);

URI redirect_uri;

try {
@@ -222,6 +222,7 @@ public JsonResponseBuilder log(Logger logger, Level level, Optional<Throwable> e
metadata.deleteCharAt(metadata.length()-1);

if (ex.isPresent()) {
ex.get().printStackTrace();
metadata.append("|");
logger.log(level, metadata.toString(), ex);
if(includeStackTrace) {