Merge main into oss-docs and fix related bootstrap and test failures #137

Merged · 7 commits · Feb 25, 2021
38 changes: 38 additions & 0 deletions CONTRIBUTING.md
@@ -34,6 +34,44 @@ Feature requests
If you find yourself wishing for a feature that doesn't exist in Elasticsearch, you are probably not alone. There are bound to be others out there with similar needs. Many of the features that Elasticsearch has today have been added because our users saw the need.
Open an issue on our [issues list](https://github.com/elastic/elasticsearch/issues) on GitHub which describes the feature you would like to see, why you need it, and how it should work.


## Sign your work
The sign-off is a simple line at the end of each commit message, certifying that you wrote the change or otherwise have the right to pass it on as an open-source patch. If you can certify the below:
```
By making a contribution to this project, I certify that:
(a) The contribution was created in whole or in part by me and I
have the right to submit it under the open source license
indicated in the file; or
(b) The contribution is based upon previous work that, to the best
of my knowledge, is covered under an appropriate open source
license and I have the right under that license to submit that
work with modifications, whether created in whole or in part
by me, under the same open source license (unless I am
permitted to submit under a different license), as indicated
in the file; or
(c) The contribution was provided directly to me by some other
person who certified (a), (b) or (c) and I have not modified
it.
(d) I understand and agree that this project and the contribution
are public and that a record of the contribution (including all
personal information I submit with it, including my sign-off) is
maintained indefinitely and may be redistributed consistent with
this project or the open source license(s) involved.
```
then you just add a line to every git commit message:
```
Signed-off-by: Bob Sanders <bob.sanders@email.com>
```
You can make signing off easier by adding your name and email to your git configuration:
```
git config user.name "Bob Sanders"
git config user.email "bob.sanders@email.com"
```
Then you can sign off commits by adding the `-s` or `--signoff` flag to your usual `git commit` commands, e.g.
```
git commit -s -m "my first commit"
```
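
If you forget to sign off a commit, you can typically add the sign-off afterwards by amending it; the sketch below assumes only the most recent commit needs fixing:
```
# add a Signed-off-by trailer to the last commit without changing its message
git commit --amend --signoff --no-edit
```
Older commits can usually be fixed the same way during an interactive rebase.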

Contributing code and documentation changes
-------------------------------------------

6 changes: 4 additions & 2 deletions Jenkinsfile
@@ -1,11 +1,13 @@
pipeline {
agent any
agent { label 'search-cloud-ec2-c518xlarge' }


stages {
stage('Build') {
steps {
echo 'Building..'
sh './gradlew check --no-daemon'
// Disable backward compatibility tasks
sh './gradlew check --no-daemon --no-scan -Pbwc_tests_enabled=false'
}
}
stage('Test') {
5 changes: 3 additions & 2 deletions build.gradle
@@ -170,8 +170,9 @@ tasks.register("verifyVersions") {
* after the backport of the backcompat code is complete.
*/

boolean bwc_tests_enabled = true
final String bwc_tests_disabled_issue = "" /* place a PR link here when committing bwc changes */
boolean bwc_tests_enabled = false
/* place a PR link here when committing bwc changes */
final String bwc_tests_disabled_issue = "https://github.com/opendistro-for-elasticsearch/search/issues/105"
if (bwc_tests_enabled == false) {
if (bwc_tests_disabled_issue.isEmpty()) {
throw new GradleException("bwc_tests_disabled_issue must be set when bwc_tests_enabled == false")
4 changes: 0 additions & 4 deletions buildSrc/build.gradle
@@ -171,12 +171,8 @@ if (project != rootProject) {

dependencies {
reaper project('reaper')
distribution project(':distribution:archives:windows-zip')
distribution project(':distribution:archives:oss-windows-zip')
distribution project(':distribution:archives:darwin-tar')
distribution project(':distribution:archives:oss-darwin-tar')
distribution project(':distribution:archives:linux-aarch64-tar')
distribution project(':distribution:archives:linux-tar')
distribution project(':distribution:archives:oss-linux-tar')
distribution project(':distribution:archives:oss-linux-aarch64-tar')

@@ -108,7 +108,7 @@ class DistributionDownloadPluginFuncTest extends AbstractGradleFuncTest {

then:
result.tasks.size() == 3
result.output.count("Unpacking elasticsearch-${version}-linux-x86_64.tar.gz " +
result.output.count("Unpacking elasticsearch-oss-${version}-linux-x86_64.tar.gz " +
"using SymbolicLinkPreservingUntarTransform.") == 1
}

@@ -155,4 +155,4 @@ class DistributionDownloadPluginFuncTest extends AbstractGradleFuncTest {
}
"""
}
}
}
@@ -112,6 +112,7 @@ abstract class AbstractGradleFuncTest extends Specification {
}

void setupLocalGitRepo() {
//TODO: cleanup
execute("git init")
execute('git config user.email "build-tool@elastic.co"')
execute('git config user.name "Build tool"')
@@ -54,7 +54,7 @@ class DistributionDownloadFixture {
private static String urlPath(String version,ElasticsearchDistribution.Platform platform) {
String fileType = ((platform == ElasticsearchDistribution.Platform.LINUX ||
platform == ElasticsearchDistribution.Platform.DARWIN)) ? "tar.gz" : "zip"
"/downloads/elasticsearch/elasticsearch-${version}-${platform}-x86_64.$fileType"
"/downloads/elasticsearch/elasticsearch-oss-${version}-${platform}-x86_64.$fileType"
}

private static byte[] filebytes(String urlPath) throws IOException {
@@ -33,7 +33,7 @@ class InternalDistributionArchiveSetupPluginFuncTest extends AbstractGradleFuncT
def setup() {
buildFile << """
import org.elasticsearch.gradle.tar.SymbolicLinkPreservingTar
plugins {
id 'elasticsearch.internal-distribution-archive-setup'
}
@@ -60,7 +60,6 @@ class InternalDistributionArchiveSetupPluginFuncTest extends AbstractGradleFuncT

where:
buildTaskName | expectedOutputArchivePath
"buildDarwinTar" | "darwin-tar/build/distributions/elasticsearch.tar.gz"
"buildOssDarwinTar" | "oss-darwin-tar/build/distributions/elasticsearch-oss.tar.gz"
}

@@ -82,7 +81,6 @@ class InternalDistributionArchiveSetupPluginFuncTest extends AbstractGradleFuncT

where:
buildTaskName | expectedOutputArchivePath
"buildDarwinZip" | "darwin-zip/build/distributions/elasticsearch.zip"
"buildOssDarwinZip" | "oss-darwin-zip/build/distributions/elasticsearch-oss.zip"
}

@@ -111,23 +109,23 @@ class InternalDistributionArchiveSetupPluginFuncTest extends AbstractGradleFuncT
}
}
}
project('consumer') { p ->
configurations {
consumeArchive {}
consumeDir {}
}
dependencies {
consumeDir project(path: ':producer-tar', configuration:'extracted')
consumeArchive project(path: ':producer-tar', configuration:'default' )
}
tasks.register("copyDir", Copy) {
from(configurations.consumeDir)
into('build/dir')
}
tasks.register("copyArchive", Copy) {
from(configurations.consumeArchive)
into('build/archives')
@@ -140,8 +138,8 @@ class InternalDistributionArchiveSetupPluginFuncTest extends AbstractGradleFuncT
then: "tar task executed and target folder contains plain tar"
result.task(':buildProducerTar').outcome == TaskOutcome.SUCCESS
result.task(':consumer:copyArchive').outcome == TaskOutcome.SUCCESS
file("producer-tar/build/distributions/elasticsearch.tar.gz").exists()
file("consumer/build/archives/elasticsearch.tar.gz").exists()
file("producer-tar/build/distributions/elasticsearch-oss.tar.gz").exists()
file("consumer/build/archives/elasticsearch-oss.tar.gz").exists()

when:
result = gradleRunner("copyDir", "-Pversion=1.0").build()
@@ -46,32 +46,29 @@ class InternalDistributionBwcSetupPluginFuncTest extends AbstractGradleFuncTest
def "builds distribution from branches via archives assemble"() {
when:
def result = gradleRunner(new File(testProjectDir.root, "remote"),
":distribution:bwc:bugfix:buildBwcDarwinTar",
":distribution:bwc:bugfix:buildBwcOssDarwinTar",
"-DtestRemoteRepo=" + remoteGitRepo,
"-Dbwc.remote=origin")
.build()
then:
result.task(":distribution:bwc:bugfix:buildBwcDarwinTar").outcome == TaskOutcome.SUCCESS
result.task(":distribution:bwc:bugfix:buildBwcOssDarwinTar").outcome == TaskOutcome.SUCCESS

and: "assemble task triggered"
result.output.contains("[8.0.1] > Task :distribution:archives:darwin-tar:assemble")
result.output.contains("[8.0.1] > Task :distribution:archives:oss-darwin-tar:assemble")
}

def "bwc distribution archives can be resolved as bwc project artifact"() {
setup:
new File(testProjectDir.root, 'remote/build.gradle') << """
configurations {
dists
}
dependencies {
dists project(path: ":distribution:bwc:bugfix", configuration:"darwin-tar")
dists project(path: ":distribution:bwc:bugfix", configuration:"oss-darwin-tar")
}
tasks.register("resolveDistributionArchive") {
inputs.files(configurations.dists)
doLast {
@@ -89,27 +86,27 @@ class InternalDistributionBwcSetupPluginFuncTest extends AbstractGradleFuncTest
.build()
then:
result.task(":resolveDistributionArchive").outcome == TaskOutcome.SUCCESS
result.task(":distribution:bwc:bugfix:buildBwcDarwinTar").outcome == TaskOutcome.SUCCESS
result.task(":distribution:bwc:bugfix:buildBwcOssDarwinTar").outcome == TaskOutcome.SUCCESS

and: "assemble task triggered"
result.output.contains("[8.0.1] > Task :distribution:archives:darwin-tar:assemble")
result.output.contains("[8.0.1] > Task :distribution:archives:oss-darwin-tar:assemble")
normalizedOutput(result.output)
.contains("distfile /distribution/bwc/bugfix/build/bwc/checkout-8.0/distribution/archives/darwin-tar/" +
"build/distributions/elasticsearch-8.0.1-SNAPSHOT-darwin-x86_64.tar.gz")
.contains("distfile /distribution/bwc/bugfix/build/bwc/checkout-8.0/distribution/archives/oss-darwin-tar/" +
"build/distributions/elasticsearch-oss-8.0.1-SNAPSHOT-darwin-x86_64.tar.gz")
}

def "bwc expanded distribution folder can be resolved as bwc project artifact"() {
setup:
new File(testProjectDir.root, 'remote/build.gradle') << """
configurations {
expandedDist
}
dependencies {
expandedDist project(path: ":distribution:bwc:bugfix", configuration:"expanded-darwin-tar")
expandedDist project(path: ":distribution:bwc:bugfix", configuration:"expanded-oss-darwin-tar")
}
tasks.register("resolveExpandedDistribution") {
inputs.files(configurations.expandedDist)
doLast {
@@ -127,13 +124,13 @@ class InternalDistributionBwcSetupPluginFuncTest extends AbstractGradleFuncTest
.build()
then:
result.task(":resolveExpandedDistribution").outcome == TaskOutcome.SUCCESS
result.task(":distribution:bwc:bugfix:buildBwcDarwinTar").outcome == TaskOutcome.SUCCESS
result.task(":distribution:bwc:bugfix:buildBwcOssDarwinTar").outcome == TaskOutcome.SUCCESS

and: "assemble task triggered"
result.output.contains("[8.0.1] > Task :distribution:archives:darwin-tar:assemble")
result.output.contains("[8.0.1] > Task :distribution:archives:oss-darwin-tar:assemble")
normalizedOutput(result.output)
.contains("distfile /distribution/bwc/bugfix/build/bwc/checkout-8.0/" +
"distribution/archives/darwin-tar/build/install")
"distribution/archives/oss-darwin-tar/build/install")
}

File setupGitRemote() {
@@ -72,7 +72,7 @@ class InternalDistributionDownloadPluginFuncTest extends AbstractGradleFuncTest
def result = gradleRunner("setupDistro", '-g', testProjectDir.newFolder('GUH').path).build()

then:
result.task(":distribution:archives:linux-tar:buildExpanded").outcome == TaskOutcome.SUCCESS
result.task(":distribution:archives:oss-linux-tar:buildExpanded").outcome == TaskOutcome.SUCCESS
result.task(":setupDistro").outcome == TaskOutcome.SUCCESS
assertExtractedDistroIsCreated("build/distro", 'current-marker.txt')
}
@@ -144,24 +144,24 @@ class InternalDistributionDownloadPluginFuncTest extends AbstractGradleFuncTest
apply plugin:'base'
// packed distro
configurations.create("linux-tar")
configurations.create("oss-linux-tar")
tasks.register("buildBwcTask", Tar) {
from('bwc-marker.txt')
archiveExtension = "tar.gz"
compression = Compression.GZIP
}
artifacts {
it.add("linux-tar", buildBwcTask)
it.add("oss-linux-tar", buildBwcTask)
}
// expanded distro
configurations.create("expanded-linux-tar")
configurations.create("expanded-oss-linux-tar")
def expandedTask = tasks.register("buildBwcExpandedTask", Copy) {
from('bwc-marker.txt')
into('build/install/elastic-distro')
}
artifacts {
it.add("expanded-linux-tar", file('build/install')) {
it.add("expanded-oss-linux-tar", file('build/install')) {
builtBy expandedTask
type = 'directory'
}
@@ -171,9 +171,9 @@ class InternalDistributionDownloadPluginFuncTest extends AbstractGradleFuncTest

private void localDistroSetup() {
settingsFile << """
include ":distribution:archives:linux-tar"
include ":distribution:archives:oss-linux-tar"
"""
def bwcSubProjectFolder = testProjectDir.newFolder("distribution", "archives", "linux-tar")
def bwcSubProjectFolder = testProjectDir.newFolder("distribution", "archives", "oss-linux-tar")
new File(bwcSubProjectFolder, 'current-marker.txt') << "current"
new File(bwcSubProjectFolder, 'build.gradle') << """
import org.gradle.api.internal.artifacts.ArtifactAttributes;
@@ -23,7 +23,7 @@ subprojects {
tasks.register('tar', Tar) {
from('.')
destinationDirectory.set(file('build/distributions'))
archiveBaseName.set("elasticsearch${project.name.startsWith('oss')?'-oss':''}")
archiveBaseName.set("elasticsearch-oss")
archiveVersion.set("8.0.1-SNAPSHOT")
archiveClassifier.set("darwin-x86_64")
archiveExtension.set('tar.gz')
@@ -19,5 +19,4 @@

include ":distribution:bwc:bugfix"
include ":distribution:bwc:minor"
include ":distribution:archives:darwin-tar"
include ":distribution:archives:oss-darwin-tar"
@@ -23,8 +23,6 @@ import org.elasticsearch.gradle.Version
import org.elasticsearch.gradle.VersionProperties
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.Task
import org.gradle.api.tasks.TaskProvider

/**
* Sets up tests for documentation.
@@ -37,7 +35,7 @@ class DocsTestPlugin implements Plugin<Project> {
project.pluginManager.apply('elasticsearch.standalone-rest-test')
project.pluginManager.apply('elasticsearch.rest-test')

String distribution = System.getProperty('tests.distribution', 'default')
String distribution = System.getProperty('tests.distribution', 'oss')
// The distribution can be configured with -Dtests.distribution on the command line
project.testClusters.integTest.testDistribution = distribution.toUpperCase()
project.testClusters.integTest.nameCustomization = { it.replace("integTest", "node") }
@@ -51,7 +49,6 @@ class DocsTestPlugin implements Plugin<Project> {
'\\{version\\}': Version.fromString(VersionProperties.elasticsearch).toString(),
'\\{version_qualified\\}': VersionProperties.elasticsearch,
'\\{lucene_version\\}' : VersionProperties.lucene.replaceAll('-snapshot-\\w+$', ''),
'\\{build_flavor\\}' : distribution,
'\\{build_type\\}' : OS.conditionalString().onWindows({"zip"}).onUnix({"tar"}).supply(),
]
project.tasks.register('listSnippets', SnippetsTask) {
@@ -193,35 +193,20 @@ class ClusterFormationTasks {
}
return
}
// TEMP HACK
// The oss docs CI build overrides the distro on the command line. This hack handles backcompat until CI is updated.
if (distro.equals('oss-zip')) {
distro = 'oss'
}
if (distro.equals('zip')) {
distro = 'default'
}
// END TEMP HACK
if (['oss', 'default'].contains(distro) == false) {
throw new GradleException("Unknown distribution: ${distro} in project ${project.path}")
}
distro = 'oss'

Version version = Version.fromString(elasticsearchVersion)
String os = getOs()
String classifier = "-${os}-x86_64"
String packaging = os.equals('windows') ? 'zip' : 'tar.gz'
String artifactName = 'elasticsearch'
if (distro.equals('oss') && Version.fromString(elasticsearchVersion).onOrAfter('6.3.0')) {
artifactName += '-oss'
}
String artifactName = 'elasticsearch-oss'
Object dependency
String snapshotProject = "${os}-${os.equals('windows') ? 'zip' : 'tar'}"
if (version.before("7.0.0")) {
snapshotProject = "zip"
packaging = "zip"
}
if (distro.equals("oss")) {
snapshotProject = "oss-" + snapshotProject
}
snapshotProject = "oss-" + snapshotProject

BwcVersions.UnreleasedVersionInfo unreleasedInfo = null

@@ -19,7 +19,6 @@

package org.elasticsearch.gradle;

import org.elasticsearch.gradle.ElasticsearchDistribution.Flavor;
import org.elasticsearch.gradle.ElasticsearchDistribution.Platform;
import org.elasticsearch.gradle.ElasticsearchDistribution.Type;
import org.elasticsearch.gradle.docker.DockerSupportPlugin;
@@ -38,7 +37,6 @@
import org.gradle.api.provider.Provider;

import java.util.Comparator;
import static org.elasticsearch.gradle.util.Util.capitalize;

/**
* A plugin to manage getting and extracting distributions of Elasticsearch.
@@ -193,43 +191,8 @@ private String dependencyNotation(ElasticsearchDistribution distribution) {
} else if (distribution.getType() == Type.RPM && distroVersion.before("7.0.0")) {
classifier = "";
}
String flavor = "";
if (distribution.getFlavor() == Flavor.OSS && distroVersion.onOrAfter("6.3.0")) {
flavor = "-oss";
}

String group = distribution.getVersion().endsWith("-SNAPSHOT") ? FAKE_SNAPSHOT_IVY_GROUP : FAKE_IVY_GROUP;
return group + ":elasticsearch" + flavor + ":" + distribution.getVersion() + classifier + "@" + extension;
}

private static String configName(String prefix, ElasticsearchDistribution distribution) {
return String.format(
"%s_%s_%s_%s%s%s",
prefix,
distribution.getVersion(),
distribution.getType(),
distribution.getPlatform() == null ? "" : distribution.getPlatform() + "_",
distribution.getFlavor(),
distribution.getBundledJdk() ? "" : "_nojdk"
);
}

private static String extractTaskName(ElasticsearchDistribution distribution) {
String taskName = "extractElasticsearch";
if (distribution.getType() != Type.INTEG_TEST_ZIP) {
if (distribution.getFlavor() == Flavor.OSS) {
taskName += "Oss";
}
if (distribution.getBundledJdk() == false) {
taskName += "NoJdk";
}
}
if (distribution.getType() == Type.ARCHIVE) {
taskName += capitalize(distribution.getPlatform().toString());
} else if (distribution.getType() != Type.INTEG_TEST_ZIP) {
taskName += capitalize(distribution.getType().toString());
}
taskName += distribution.getVersion();
return taskName;
return group + ":elasticsearch-oss" + ":" + distribution.getVersion() + classifier + "@" + extension;
}
}
@@ -23,9 +23,7 @@
* This class models the different Docker base images that are used to build Docker distributions of Elasticsearch.
*/
public enum DockerBase {
CENTOS("centos:8"),
// "latest" here is intentional, since the image name specifies "8"
UBI("docker.elastic.co/ubi8/ubi-minimal:latest");
CENTOS("centos:8");

private final String image;

@@ -50,9 +50,7 @@ public enum Type {
ARCHIVE,
RPM,
DEB,
DOCKER,
// This is a different flavour of Docker image
DOCKER_UBI;
DOCKER;

@Override
public String toString() {
@@ -63,7 +61,6 @@ public boolean shouldExtract() {
switch (this) {
case DEB:
case DOCKER:
case DOCKER_UBI:
case RPM:
return false;

@@ -73,16 +70,6 @@ public boolean shouldExtract() {
}
}

public enum Flavor {
DEFAULT,
OSS;

@Override
public String toString() {
return super.toString().toLowerCase(Locale.ROOT);
}
}

// package private to tests can use
public static final Platform CURRENT_PLATFORM = OS.<Platform>conditional()
.onLinux(() -> Platform.LINUX)
@@ -99,7 +86,6 @@ public String toString() {
private final Property<String> version;
private final Property<Type> type;
private final Property<Platform> platform;
private final Property<Flavor> flavor;
private final Property<Boolean> bundledJdk;
private final Property<Boolean> failIfUnavailable;
private final Configuration extracted;
@@ -119,7 +105,6 @@ public String toString() {
this.type = objectFactory.property(Type.class);
this.type.convention(Type.ARCHIVE);
this.platform = objectFactory.property(Platform.class);
this.flavor = objectFactory.property(Flavor.class);
this.bundledJdk = objectFactory.property(Boolean.class);
this.failIfUnavailable = objectFactory.property(Boolean.class).convention(true);
this.extracted = extractedConfiguration;
@@ -154,21 +139,13 @@ public void setType(Type type) {
this.type.set(type);
}

public Flavor getFlavor() {
return flavor.getOrNull();
}

public void setFlavor(Flavor flavor) {
this.flavor.set(flavor);
}

public boolean getBundledJdk() {
return bundledJdk.getOrElse(true);
}

public boolean isDocker() {
final Type type = this.type.get();
return type == Type.DOCKER || type == Type.DOCKER_UBI;
return type == Type.DOCKER;
}

public void setBundledJdk(Boolean bundledJdk) {
@@ -204,7 +181,6 @@ public Configuration getExtracted() {
switch (getType()) {
case DEB:
case DOCKER:
case DOCKER_UBI:
case RPM:
throw new UnsupportedOperationException(
"distribution type [" + getType() + "] for " + "elasticsearch distribution [" + name + "] cannot be extracted"
@@ -239,11 +215,7 @@ void finalizeValues() {
"platform cannot be set on elasticsearch distribution [" + name + "] of type [integ_test_zip]"
);
}
if (flavor.getOrNull() != null) {
throw new IllegalArgumentException(
"flavor [" + flavor.get() + "] not allowed for elasticsearch distribution [" + name + "] of type [integ_test_zip]"
);
}

if (bundledJdk.getOrNull() != null) {
throw new IllegalArgumentException(
"bundledJdk cannot be set on elasticsearch distribution [" + name + "] of type [integ_test_zip]"
@@ -275,23 +247,16 @@ void finalizeValues() {
"bundledJdk cannot be set on elasticsearch distribution [" + name + "] of type " + "[docker]"
);
}
if (flavor.get() == Flavor.OSS && type.get() == Type.DOCKER_UBI) {
throw new IllegalArgumentException("Cannot build a UBI docker image for the OSS distribution");
}
}
}

if (flavor.isPresent() == false) {
flavor.set(Flavor.DEFAULT);
}
if (bundledJdk.isPresent() == false) {
bundledJdk.set(true);
}

version.finalizeValue();
platform.finalizeValue();
type.finalizeValue();
flavor.finalizeValue();
bundledJdk.finalizeValue();
}
}
@@ -79,6 +79,7 @@ public static void configureRepositories(Project project) {
throw new GradleException("Malformed lucene snapshot version: " + luceneVersion);
}
String revision = matcher.group(1);
// TODO(cleanup) - Setup own lucene snapshot repo
MavenArtifactRepository luceneRepo = repos.maven(repo -> {
repo.setName("lucene-snapshots");
repo.setUrl("https://s3.amazonaws.com/download.elasticsearch.org/lucenesnapshots/" + revision);
@@ -108,7 +108,7 @@ private void configureGeneralTaskDefaults(Project project) {
project.getTasks().withType(AbstractArchiveTask.class).configureEach(t -> {
String subdir = archiveTaskToSubprojectName(t.getName());
t.getDestinationDirectory().set(project.file(subdir + "/build/distributions"));
t.getArchiveBaseName().set(subdir.contains("oss") ? "elasticsearch-oss" : "elasticsearch");
t.getArchiveBaseName().set("elasticsearch-oss");
});
}

@@ -119,7 +119,7 @@ private void registerBwcArtifacts(Project bwcProject, DistributionProject distri

private void registerDistributionArchiveArtifact(Project bwcProject, DistributionProject distributionProject, String buildBwcTask) {
String artifactFileName = distributionProject.getDistFile().getName();
String artifactName = artifactFileName.contains("oss") ? "elasticsearch-oss" : "elasticsearch";
String artifactName = "elasticsearch-oss";

String suffix = artifactFileName.endsWith("tar.gz") ? "tar.gz" : artifactFileName.substring(artifactFileName.length() - 3);
int archIndex = artifactFileName.indexOf("x86_64");
@@ -142,12 +142,12 @@ private void registerDistributionArchiveArtifact(Project bwcProject, Distributio
private static List<DistributionProject> resolveArchiveProjects(File checkoutDir, Version bwcVersion) {
List<String> projects = new ArrayList<>();
// All active BWC branches publish default and oss variants of rpm and deb packages
projects.addAll(asList("deb", "rpm", "oss-deb", "oss-rpm"));
projects.addAll(asList("oss-deb", "oss-rpm"));

if (bwcVersion.onOrAfter("7.0.0")) { // starting with 7.0 we bundle a jdk which means we have platform-specific archives
projects.addAll(asList("oss-windows-zip", "windows-zip", "oss-darwin-tar", "darwin-tar", "oss-linux-tar", "linux-tar"));
projects.addAll(asList("oss-windows-zip", "oss-darwin-tar", "oss-linux-tar"));
} else { // prior to 7.0 we published only a single zip and tar archives for oss and default distributions
projects.addAll(asList("oss-zip", "zip", "tar", "oss-tar"));
projects.addAll(asList("oss-zip", "oss-tar"));
}

return projects.stream().map(name -> {
@@ -157,7 +157,7 @@ private static List<DistributionProject> resolveArchiveProjects(File checkoutDir
if (bwcVersion.onOrAfter("7.0.0")) {
if (name.contains("zip") || name.contains("tar")) {
int index = name.lastIndexOf('-');
String baseName = name.startsWith("oss-") ? name.substring(4, index) : name.substring(0, index);
String baseName = name.substring(4, index); // oss-
classifier = "-" + baseName + "-x86_64";
extension = name.substring(index + 1);
if (extension.equals("tar")) {
@@ -168,7 +168,7 @@ private static List<DistributionProject> resolveArchiveProjects(File checkoutDir
} else if (name.contains("rpm")) {
classifier = "-x86_64";
}
} else if (name.contains("oss-")) {
} else {
extension = name.substring(4);
}
return new DistributionProject(name, baseDir, bwcVersion, classifier, extension, checkoutDir);
@@ -228,16 +228,7 @@ private static class DistributionProject {
this.projectPath = baseDir + "/" + name;
this.distFile = new File(
checkoutDir,
baseDir
+ "/"
+ name
+ "/build/distributions/elasticsearch-"
+ (name.startsWith("oss") ? "oss-" : "")
+ version
+ "-SNAPSHOT"
+ classifier
+ "."
+ extension
baseDir + "/" + name + "/build/distributions/elasticsearch-oss-" + version + "-SNAPSHOT" + classifier + "." + extension
);
// we only ported this down to the 7.x branch.
if (version.onOrAfter("7.10.0") && (name.endsWith("zip") || name.endsWith("tar"))) {
@@ -127,7 +127,6 @@ private static String distributionProjectPath(ElasticsearchDistribution distribu
break;

case DOCKER:
case DOCKER_UBI:
projectPath += ":docker:";
projectPath += distributionProjectName(distribution);
break;
@@ -155,9 +154,7 @@ private static String distributionProjectName(ElasticsearchDistribution distribu
? ""
: "-" + architecture.toString().toLowerCase();

if (distribution.getFlavor() == ElasticsearchDistribution.Flavor.OSS) {
projectName += "oss-";
}
projectName += "oss-";

if (distribution.getBundledJdk() == false) {
projectName += "no-jdk-";
@@ -169,18 +166,14 @@ private static String distributionProjectName(ElasticsearchDistribution distribu
? "-zip"
: "-tar");
} else {
projectName = distribution.getFlavor().equals(ElasticsearchDistribution.Flavor.DEFAULT) ? "zip" : "oss-zip";
projectName = "oss-zip";
}
break;

case DOCKER:
projectName += "docker" + archString + "-export";
break;

case DOCKER_UBI:
projectName += "ubi-docker" + archString + "-export";
break;

default:
projectName += distribution.getType();
break;
@@ -22,7 +22,6 @@
import org.elasticsearch.gradle.Architecture;
import org.elasticsearch.gradle.DistributionDownloadPlugin;
import org.elasticsearch.gradle.ElasticsearchDistribution;
import org.elasticsearch.gradle.ElasticsearchDistribution.Flavor;
import org.elasticsearch.gradle.ElasticsearchDistribution.Platform;
import org.elasticsearch.gradle.ElasticsearchDistribution.Type;
import org.elasticsearch.gradle.Jdk;
@@ -110,24 +109,28 @@ public void apply(Project project) {
TaskProvider<?> depsTask = project.getTasks().register(taskname + "#deps");
depsTask.configure(t -> t.dependsOn(distribution, examplePlugin));
depsTasks.put(taskname, depsTask);
TaskProvider<Test> destructiveTask = configureTestTask(project, taskname, distribution, t -> {
t.onlyIf(t2 -> distribution.isDocker() == false || dockerSupport.get().getDockerAvailability().isAvailable);
addSysprop(t, DISTRIBUTION_SYSPROP, distribution::getFilepath);
addSysprop(t, EXAMPLE_PLUGIN_SYSPROP, () -> examplePlugin.getSingleFile().toString());
t.exclude("**/PackageUpgradeTests.class");
}, depsTask);

if (distribution.getPlatform() == Platform.WINDOWS) {
windowsTestTasks.add(destructiveTask);
} else {
linuxTestTasks.computeIfAbsent(distribution.getType(), k -> new ArrayList<>()).add(destructiveTask);
// TODO - suppressing failure temporarily where duplicate tasks are created for docker.
try {
TaskProvider<Test> destructiveTask = configureTestTask(project, taskname, distribution, t -> {
t.onlyIf(t2 -> distribution.isDocker() == false || dockerSupport.get().getDockerAvailability().isAvailable);
addSysprop(t, DISTRIBUTION_SYSPROP, distribution::getFilepath);
addSysprop(t, EXAMPLE_PLUGIN_SYSPROP, () -> examplePlugin.getSingleFile().toString());
t.exclude("**/PackageUpgradeTests.class");
}, depsTask);
if (distribution.getPlatform() == Platform.WINDOWS) {
windowsTestTasks.add(destructiveTask);
} else {
linuxTestTasks.computeIfAbsent(distribution.getType(), k -> new ArrayList<>()).add(destructiveTask);
}
destructiveDistroTest.configure(t -> t.dependsOn(destructiveTask));
lifecycleTasks.get(distribution.getType()).configure(t -> t.dependsOn(destructiveTask));
} catch (Exception ex) {
System.out.println(ex.getMessage());
}
destructiveDistroTest.configure(t -> t.dependsOn(destructiveTask));
lifecycleTasks.get(distribution.getType()).configure(t -> t.dependsOn(destructiveTask));

if ((distribution.getType() == Type.DEB || distribution.getType() == Type.RPM) && distribution.getBundledJdk()) {
for (Version version : BuildParams.getBwcVersions().getIndexCompatible()) {
if (distribution.getFlavor() == Flavor.OSS && version.before("6.3.0")) {
if (version.before("6.3.0")) {
continue; // before opening xpack
}
final ElasticsearchDistribution bwcDistro;
@@ -140,7 +143,6 @@ public void apply(Project project) {
distribution.getArchitecture(),
distribution.getType(),
distribution.getPlatform(),
distribution.getFlavor(),
distribution.getBundledJdk(),
version.toString()
);
@@ -206,8 +208,7 @@ public void apply(Project project) {
// auto-detection doesn't work.
//
// The shouldTestDocker property could be null, hence we use Boolean.TRUE.equals()
boolean shouldExecute = (type != Type.DOCKER && type != Type.DOCKER_UBI)
|| Boolean.TRUE.equals(vmProject.findProperty("shouldTestDocker"));
boolean shouldExecute = (type != Type.DOCKER) || Boolean.TRUE.equals(vmProject.findProperty("shouldTestDocker"));

if (shouldExecute) {
distroTest.configure(t -> t.dependsOn(wrapperTask));
@@ -234,7 +235,6 @@ private static Map<ElasticsearchDistribution.Type, TaskProvider<?>> lifecycleTas
Map<ElasticsearchDistribution.Type, TaskProvider<?>> lifecyleTasks = new HashMap<>();

lifecyleTasks.put(Type.DOCKER, project.getTasks().register(taskPrefix + ".docker"));
lifecyleTasks.put(Type.DOCKER_UBI, project.getTasks().register(taskPrefix + ".ubi"));
lifecyleTasks.put(Type.ARCHIVE, project.getTasks().register(taskPrefix + ".archives"));
lifecyleTasks.put(Type.DEB, project.getTasks().register(taskPrefix + ".packages"));
lifecyleTasks.put(Type.RPM, lifecyleTasks.get(Type.DEB));
@@ -363,55 +363,38 @@ private List<ElasticsearchDistribution> configureDistributions(Project project)
List<ElasticsearchDistribution> currentDistros = new ArrayList<>();

for (Architecture architecture : Architecture.values()) {
for (Type type : Arrays.asList(Type.DEB, Type.RPM, Type.DOCKER, Type.DOCKER_UBI)) {
for (Flavor flavor : Flavor.values()) {
for (boolean bundledJdk : Arrays.asList(true, false)) {
if (bundledJdk == false) {
// We'll never publish an ARM (aarch64) build without a bundled JDK.
if (architecture == Architecture.AARCH64) {
continue;
}
// All our Docker images include a bundled JDK so it doesn't make sense to test without one.
if (type == Type.DOCKER || type == Type.DOCKER_UBI) {
continue;
}
for (Type type : Arrays.asList(Type.DEB, Type.RPM, Type.DOCKER)) {
for (boolean bundledJdk : Arrays.asList(true, false)) {
if (bundledJdk == false) {
// We'll never publish an ARM (aarch64) build without a bundled JDK.
if (architecture == Architecture.AARCH64) {
continue;
}

// We don't publish the OSS distribution on UBI
if (type == Type.DOCKER_UBI && flavor == Flavor.OSS) {
// All our Docker images include a bundled JDK so it doesn't make sense to test without one.
if (type == Type.DOCKER) {
continue;
}

currentDistros.add(
createDistro(distributions, architecture, type, null, flavor, bundledJdk, VersionProperties.getElasticsearch())
);
}

currentDistros.add(
createDistro(distributions, architecture, type, null, bundledJdk, VersionProperties.getElasticsearch())
);
}
}
}

for (Architecture architecture : Architecture.values()) {
for (Platform platform : Arrays.asList(Platform.LINUX, Platform.WINDOWS)) {
for (Flavor flavor : Flavor.values()) {
for (boolean bundledJdk : Arrays.asList(true, false)) {
if (bundledJdk == false && architecture != Architecture.X64) {
// We will never publish distributions for non-x86 (amd64) platforms
// without a bundled JDK
continue;
}

currentDistros.add(
createDistro(
distributions,
architecture,
Type.ARCHIVE,
platform,
flavor,
bundledJdk,
VersionProperties.getElasticsearch()
)
);
for (boolean bundledJdk : Arrays.asList(true, false)) {
if (bundledJdk == false && architecture != Architecture.X64) {
// We will never publish distributions for non-x86 (amd64) platforms
// without a bundled JDK
continue;
}

currentDistros.add(
createDistro(distributions, architecture, Type.ARCHIVE, platform, bundledJdk, VersionProperties.getElasticsearch())
);
}
}
}
@@ -424,15 +407,13 @@ private static ElasticsearchDistribution createDistro(
Architecture architecture,
Type type,
Platform platform,
Flavor flavor,
boolean bundledJdk,
String version
) {
String name = distroId(type, platform, flavor, bundledJdk, architecture) + "-" + version;
boolean isDocker = type == Type.DOCKER || type == Type.DOCKER_UBI;
String name = distroId(type, platform, bundledJdk, architecture) + "-" + version;
boolean isDocker = type == Type.DOCKER;
ElasticsearchDistribution distro = distributions.create(name, d -> {
d.setArchitecture(architecture);
d.setFlavor(flavor);
d.setType(type);
if (type == Type.ARCHIVE) {
d.setPlatform(platform);
@@ -457,27 +438,23 @@ private static boolean isWindows(Project project) {
return project.getName().contains("windows");
}

private static String distroId(Type type, Platform platform, Flavor flavor, boolean bundledJdk, Architecture architecture) {
return flavor
+ "-"
+ (type == Type.ARCHIVE ? platform + "-" : "")
+ type
+ (bundledJdk ? "" : "-no-jdk")
+ (architecture == Architecture.X64 ? "" : "-" + architecture.toString().toLowerCase());
private static String distroId(Type type, Platform platform, boolean bundledJdk, Architecture architecture) {
return (type == Type.ARCHIVE ? platform + "-" : "") + type + (bundledJdk ? "" : "-no-jdk") + (architecture == Architecture.X64
? ""
: "-" + architecture.toString().toLowerCase());
}

private static String destructiveDistroTestTaskName(ElasticsearchDistribution distro) {
Type type = distro.getType();
return "destructiveDistroTest."
+ distroId(type, distro.getPlatform(), distro.getFlavor(), distro.getBundledJdk(), distro.getArchitecture());
return "destructiveDistroTest." + distroId(type, distro.getPlatform(), distro.getBundledJdk(), distro.getArchitecture());
}

private static String destructiveDistroUpgradeTestTaskName(ElasticsearchDistribution distro, String bwcVersion) {
Type type = distro.getType();
return "destructiveDistroUpgradeTest.v"
+ bwcVersion
+ "."
+ distroId(type, distro.getPlatform(), distro.getFlavor(), distro.getBundledJdk(), distro.getArchitecture());
+ distroId(type, distro.getPlatform(), distro.getBundledJdk(), distro.getArchitecture());
}

private static void addSysprop(Test task, String sysprop, Supplier<String> valueSupplier) {
@@ -267,16 +267,10 @@ private void setDistributionType(ElasticsearchDistribution distribution, TestDis
if (testDistribution == TestDistribution.INTEG_TEST) {
distribution.setType(ElasticsearchDistribution.Type.INTEG_TEST_ZIP);
// we change the underlying distribution when changing the test distribution of the cluster.
distribution.setFlavor(null);
distribution.setPlatform(null);
distribution.setBundledJdk(null);
} else {
distribution.setType(ElasticsearchDistribution.Type.ARCHIVE);
if (testDistribution == TestDistribution.DEFAULT) {
distribution.setFlavor(ElasticsearchDistribution.Flavor.DEFAULT);
} else {
distribution.setFlavor(ElasticsearchDistribution.Flavor.OSS);
}
}
}

@@ -19,7 +19,6 @@

package org.elasticsearch.gradle;

import org.elasticsearch.gradle.ElasticsearchDistribution.Flavor;
import org.elasticsearch.gradle.ElasticsearchDistribution.Platform;
import org.elasticsearch.gradle.ElasticsearchDistribution.Type;
import org.elasticsearch.gradle.info.BuildParams;
@@ -63,15 +62,7 @@ public class DistributionDownloadPluginTests extends GradleUnitTestCase {
);

public void testVersionDefault() {
ElasticsearchDistribution distro = checkDistro(
createProject(null, false),
"testdistro",
null,
Type.ARCHIVE,
Platform.LINUX,
Flavor.OSS,
true
);
ElasticsearchDistribution distro = checkDistro(createProject(null, false), "testdistro", null, Type.ARCHIVE, Platform.LINUX, true);
assertEquals(distro.getVersion(), VersionProperties.getElasticsearch());
}

@@ -82,35 +73,18 @@ public void testBadVersionFormat() {
"badversion",
Type.ARCHIVE,
Platform.LINUX,
Flavor.OSS,
true,
"Invalid version format: 'badversion'"
);
}

public void testTypeDefault() {
ElasticsearchDistribution distro = checkDistro(
createProject(null, false),
"testdistro",
"5.0.0",
null,
Platform.LINUX,
Flavor.OSS,
true
);
ElasticsearchDistribution distro = checkDistro(createProject(null, false), "testdistro", "5.0.0", null, Platform.LINUX, true);
assertEquals(distro.getType(), Type.ARCHIVE);
}

public void testPlatformDefault() {
ElasticsearchDistribution distro = checkDistro(
createProject(null, false),
"testdistro",
"5.0.0",
Type.ARCHIVE,
null,
Flavor.OSS,
true
);
ElasticsearchDistribution distro = checkDistro(createProject(null, false), "testdistro", "5.0.0", Type.ARCHIVE, null, true);
assertEquals(distro.getPlatform(), ElasticsearchDistribution.CURRENT_PLATFORM);
}

@@ -122,45 +96,17 @@ public void testPlatformForIntegTest() {
Type.INTEG_TEST_ZIP,
Platform.LINUX,
null,
null,
"platform cannot be set on elasticsearch distribution [testdistro]"
);
}

public void testFlavorDefault() {
ElasticsearchDistribution distro = checkDistro(
createProject(null, false),
"testdistro",
"5.0.0",
Type.ARCHIVE,
Platform.LINUX,
null,
true
);
assertEquals(distro.getFlavor(), Flavor.DEFAULT);
}

public void testFlavorForIntegTest() {
assertDistroError(
createProject(null, false),
"testdistro",
"5.0.0",
Type.INTEG_TEST_ZIP,
null,
Flavor.OSS,
null,
"flavor [oss] not allowed for elasticsearch distribution [testdistro] of type [integ_test_zip]"
);
}

public void testBundledJdkDefault() {
ElasticsearchDistribution distro = checkDistro(
createProject(null, false),
"testdistro",
"5.0.0",
Type.ARCHIVE,
Platform.LINUX,
null,
true
);
assertTrue(distro.getBundledJdk());
@@ -173,7 +119,6 @@ public void testBundledJdkForIntegTest() {
"5.0.0",
Type.INTEG_TEST_ZIP,
null,
null,
true,
"bundledJdk cannot be set on elasticsearch distribution [testdistro]"
);
@@ -184,70 +129,62 @@ public void testLocalCurrentVersionIntegTestZip() {
Project archiveProject = ProjectBuilder.builder().withParent(archivesProject).withName("integ-test-zip").build();
archiveProject.getConfigurations().create("default");
archiveProject.getArtifacts().add("default", new File("doesnotmatter"));
createDistro(project, "distro", VersionProperties.getElasticsearch(), Type.INTEG_TEST_ZIP, null, null, null);
createDistro(project, "distro", VersionProperties.getElasticsearch(), Type.INTEG_TEST_ZIP, null, null);
checkPlugin(project);
}

public void testLocalCurrentVersionArchives() {
for (Platform platform : Platform.values()) {
for (Flavor flavor : Flavor.values()) {
for (boolean bundledJdk : new boolean[] { true, false }) {
// create a new project in each iteration, so that we know we are resolving the only additional project being created
Project project = createProject(BWC_MINOR, true);
String projectName = projectName(platform.toString(), flavor, bundledJdk);
projectName += (platform == Platform.WINDOWS ? "-zip" : "-tar");
Project archiveProject = ProjectBuilder.builder().withParent(archivesProject).withName(projectName).build();
archiveProject.getConfigurations().create("default");
archiveProject.getArtifacts().add("default", new File("doesnotmatter"));
createDistro(project, "distro", VersionProperties.getElasticsearch(), Type.ARCHIVE, platform, flavor, bundledJdk);
checkPlugin(project);
}
for (boolean bundledJdk : new boolean[] { true, false }) {
// create a new project in each iteration, so that we know we are resolving the only additional project being created
Project project = createProject(BWC_MINOR, true);
String projectName = projectName(platform.toString(), bundledJdk);
projectName += (platform == Platform.WINDOWS ? "-zip" : "-tar");
Project archiveProject = ProjectBuilder.builder().withParent(archivesProject).withName(projectName).build();
archiveProject.getConfigurations().create("default");
archiveProject.getArtifacts().add("default", new File("doesnotmatter"));
createDistro(project, "distro", VersionProperties.getElasticsearch(), Type.ARCHIVE, platform, bundledJdk);
checkPlugin(project);
}
}
}

public void testLocalCurrentVersionPackages() {
for (Type packageType : new Type[] { Type.RPM, Type.DEB }) {
for (Flavor flavor : Flavor.values()) {
for (boolean bundledJdk : new boolean[] { true, false }) {
Project project = createProject(BWC_MINOR, true);
String projectName = projectName(packageType.toString(), flavor, bundledJdk);
Project packageProject = ProjectBuilder.builder().withParent(packagesProject).withName(projectName).build();
packageProject.getConfigurations().create("default");
packageProject.getArtifacts().add("default", new File("doesnotmatter"));
createDistro(project, "distro", VersionProperties.getElasticsearch(), packageType, null, flavor, bundledJdk);
checkPlugin(project);
}
for (boolean bundledJdk : new boolean[] { true, false }) {
Project project = createProject(BWC_MINOR, true);
String projectName = projectName(packageType.toString(), bundledJdk);
Project packageProject = ProjectBuilder.builder().withParent(packagesProject).withName(projectName).build();
packageProject.getConfigurations().create("default");
packageProject.getArtifacts().add("default", new File("doesnotmatter"));
createDistro(project, "distro", VersionProperties.getElasticsearch(), packageType, null, bundledJdk);
checkPlugin(project);
}
}
}

public void testLocalBwcArchives() {
for (Platform platform : Platform.values()) {
for (Flavor flavor : Flavor.values()) {
// note: no non bundled jdk for bwc
String configName = projectName(platform.toString(), flavor, true);
configName += (platform == Platform.WINDOWS ? "-zip" : "-tar");
// note: no non bundled jdk for bwc
String configName = projectName(platform.toString(), true);
configName += (platform == Platform.WINDOWS ? "-zip" : "-tar");

checkBwc("minor", configName, BWC_MINOR_VERSION, Type.ARCHIVE, platform, flavor, BWC_MINOR, true);
checkBwc("staged", configName, BWC_STAGED_VERSION, Type.ARCHIVE, platform, flavor, BWC_STAGED, true);
checkBwc("bugfix", configName, BWC_BUGFIX_VERSION, Type.ARCHIVE, platform, flavor, BWC_BUGFIX, true);
checkBwc("maintenance", configName, BWC_MAINTENANCE_VERSION, Type.ARCHIVE, platform, flavor, BWC_MAINTENANCE, true);
}
checkBwc("minor", configName, BWC_MINOR_VERSION, Type.ARCHIVE, platform, BWC_MINOR, true);
checkBwc("staged", configName, BWC_STAGED_VERSION, Type.ARCHIVE, platform, BWC_STAGED, true);
checkBwc("bugfix", configName, BWC_BUGFIX_VERSION, Type.ARCHIVE, platform, BWC_BUGFIX, true);
checkBwc("maintenance", configName, BWC_MAINTENANCE_VERSION, Type.ARCHIVE, platform, BWC_MAINTENANCE, true);
}
}

public void testLocalBwcPackages() {
for (Type packageType : new Type[] { Type.RPM, Type.DEB }) {
for (Flavor flavor : Flavor.values()) {
// note: no non bundled jdk for bwc
String configName = projectName(packageType.toString(), flavor, true);
// note: no non bundled jdk for bwc
String configName = projectName(packageType.toString(), true);

checkBwc("minor", configName, BWC_MINOR_VERSION, packageType, null, flavor, BWC_MINOR, true);
checkBwc("staged", configName, BWC_STAGED_VERSION, packageType, null, flavor, BWC_STAGED, true);
checkBwc("bugfix", configName, BWC_BUGFIX_VERSION, packageType, null, flavor, BWC_BUGFIX, true);
checkBwc("maintenance", configName, BWC_MAINTENANCE_VERSION, packageType, null, flavor, BWC_MAINTENANCE, true);
}
checkBwc("minor", configName, BWC_MINOR_VERSION, packageType, null, BWC_MINOR, true);
checkBwc("staged", configName, BWC_STAGED_VERSION, packageType, null, BWC_STAGED, true);
checkBwc("bugfix", configName, BWC_BUGFIX_VERSION, packageType, null, BWC_BUGFIX, true);
checkBwc("maintenance", configName, BWC_MAINTENANCE_VERSION, packageType, null, BWC_MAINTENANCE, true);
}
}

@@ -257,13 +194,12 @@ private void assertDistroError(
String version,
Type type,
Platform platform,
Flavor flavor,
Boolean bundledJdk,
String message
) {
IllegalArgumentException e = expectThrows(
IllegalArgumentException.class,
() -> checkDistro(project, name, version, type, platform, flavor, bundledJdk)
() -> checkDistro(project, name, version, type, platform, bundledJdk)
);
assertThat(e.getMessage(), containsString(message));
}
@@ -274,7 +210,6 @@ private ElasticsearchDistribution createDistro(
String version,
Type type,
Platform platform,
Flavor flavor,
Boolean bundledJdk
) {
NamedDomainObjectContainer<ElasticsearchDistribution> distros = DistributionDownloadPlugin.getContainer(project);
@@ -288,9 +223,6 @@ private ElasticsearchDistribution createDistro(
if (platform != null) {
distro.setPlatform(platform);
}
if (flavor != null) {
distro.setFlavor(flavor);
}
if (bundledJdk != null) {
distro.setBundledJdk(bundledJdk);
}
@@ -304,10 +236,9 @@ private ElasticsearchDistribution checkDistro(
String version,
Type type,
Platform platform,
Flavor flavor,
Boolean bundledJdk
) {
ElasticsearchDistribution distribution = createDistro(project, name, version, type, platform, flavor, bundledJdk);
ElasticsearchDistribution distribution = createDistro(project, name, version, type, platform, bundledJdk);
distribution.finalizeValues();
return distribution;
}
@@ -324,15 +255,14 @@ private void checkBwc(
Version version,
Type type,
Platform platform,
Flavor flavor,
BwcVersions bwcVersions,
boolean isInternal
) {
Project project = createProject(bwcVersions, isInternal);
Project archiveProject = ProjectBuilder.builder().withParent(bwcProject).withName(projectName).build();
archiveProject.getConfigurations().create(config);
archiveProject.getArtifacts().add(config, new File("doesnotmatter"));
createDistro(project, "distro", version.toString(), type, platform, flavor, true);
createDistro(project, "distro", version.toString(), type, platform, true);
checkPlugin(project);
}

@@ -351,11 +281,8 @@ private Project createProject(BwcVersions bwcVersions, boolean isInternal) {
return project;
}

private static String projectName(String base, Flavor flavor, boolean bundledJdk) {
String prefix = "";
if (flavor == Flavor.OSS) {
prefix += "oss-";
}
private static String projectName(String base, boolean bundledJdk) {
String prefix = "oss-";
if (bundledJdk == false) {
prefix += "no-jdk-";
}
@@ -102,14 +102,13 @@ public static class Version {
private static final ConstructingObjectParser<Version, Void> PARSER =
new ConstructingObjectParser<>(Version.class.getName(), true,
args -> {
return new Version((String) args[0], (String) args[1], (String) args[2], (String) args[3], (String) args[4],
(Boolean) args[5], (String) args[6], (String) args[7], (String) args[8]);
return new Version((String) args[0], (String) args[1], (String) args[2], (String) args[3],
(Boolean) args[4], (String) args[5], (String) args[6], (String) args[7]);
}
);

static {
PARSER.declareString(ConstructingObjectParser.constructorArg(), new ParseField("number"));
PARSER.declareString(ConstructingObjectParser.optionalConstructorArg(), new ParseField("build_flavor"));
PARSER.declareString(ConstructingObjectParser.optionalConstructorArg(), new ParseField("build_type"));
PARSER.declareString(ConstructingObjectParser.constructorArg(), new ParseField("build_hash"));
PARSER.declareString(ConstructingObjectParser.constructorArg(), new ParseField("build_date"));
@@ -118,8 +117,8 @@ public static class Version {
PARSER.declareString(ConstructingObjectParser.constructorArg(), new ParseField("minimum_wire_compatibility_version"));
PARSER.declareString(ConstructingObjectParser.constructorArg(), new ParseField("minimum_index_compatibility_version"));
}

private final String number;
private final String buildFlavor;
private final String buildType;
private final String buildHash;
private final String buildDate;
@@ -128,10 +127,9 @@ public static class Version {
private final String minimumWireCompatibilityVersion;
private final String minimumIndexCompatibilityVersion;

public Version(String number, String buildFlavor, String buildType, String buildHash, String buildDate, boolean isSnapshot,
String luceneVersion, String minimumWireCompatibilityVersion, String minimumIndexCompatibilityVersion) {
public Version(String number, String buildType, String buildHash, String buildDate, boolean isSnapshot,
String luceneVersion, String minimumWireCompatibilityVersion, String minimumIndexCompatibilityVersion) {
this.number = number;
this.buildFlavor = buildFlavor;
this.buildType = buildType;
this.buildHash = buildHash;
this.buildDate = buildDate;
@@ -145,10 +143,6 @@ public String getNumber() {
return number;
}

public String getBuildFlavor() {
return buildFlavor;
}

public String getBuildType() {
return buildType;
}
@@ -184,7 +178,6 @@ public boolean equals(Object o) {
Version version = (Version) o;
return isSnapshot == version.isSnapshot &&
number.equals(version.number) &&
Objects.equals(buildFlavor, version.buildFlavor) &&
Objects.equals(buildType, version.buildType) &&
buildHash.equals(version.buildHash) &&
buildDate.equals(version.buildDate) &&
@@ -195,7 +188,7 @@ public boolean equals(Object o) {

@Override
public int hashCode() {
return Objects.hash(number, buildFlavor, buildType, buildHash, buildDate, isSnapshot, luceneVersion,
return Objects.hash(number, buildType, buildHash, buildDate, isSnapshot, luceneVersion,
minimumWireCompatibilityVersion, minimumIndexCompatibilityVersion);
}
}
@@ -42,7 +42,6 @@ public void testInfo() throws IOException {
assertNotNull(info.getNodeName());
@SuppressWarnings("unchecked")
Map<String, Object> versionMap = (Map<String, Object>) infoAsMap.get("version");
assertEquals(versionMap.get("build_flavor"), info.getVersion().getBuildFlavor());
assertEquals(versionMap.get("build_type"), info.getVersion().getBuildType());
assertEquals(versionMap.get("build_hash"), info.getVersion().getBuildHash());
assertEquals(versionMap.get("build_date"), info.getVersion().getBuildDate());
@@ -165,7 +165,7 @@ public void testPingSocketTimeout() throws IOException {
}

public void testInfo() throws IOException {
MainResponse testInfo = new MainResponse("nodeName", new MainResponse.Version("number", "buildFlavor", "buildType", "buildHash",
MainResponse testInfo = new MainResponse("nodeName", new MainResponse.Version("number", "buildType", "buildHash",
"buildDate", true, "luceneVersion", "minimumWireCompatibilityVersion", "minimumIndexCompatibilityVersion"),
"clusterName", "clusterUuid", "You Know, for Search");
mockResponse((ToXContentFragment) (builder, params) -> {
@@ -175,7 +175,6 @@ public void testInfo() throws IOException {
builder.field("cluster_uuid", testInfo.getClusterUuid());
builder.startObject("version")
.field("number", testInfo.getVersion().getNumber())
.field("build_flavor", testInfo.getVersion().getBuildFlavor())
.field("build_type", testInfo.getVersion().getBuildType())
.field("build_hash", testInfo.getVersion().getBuildHash())
.field("build_date", testInfo.getVersion().getBuildDate())
@@ -41,7 +41,7 @@ protected org.elasticsearch.action.main.MainResponse createServerTestInstance(XC
final String date = new Date(randomNonNegativeLong()).toString();
Version version = VersionUtils.randomVersionBetween(random(), Version.V_6_0_1, Version.CURRENT);
Build build = new Build(
Build.Flavor.UNKNOWN, Build.Type.UNKNOWN, randomAlphaOfLength(8), date, randomBoolean(),
Build.Type.UNKNOWN, randomAlphaOfLength(8), date, randomBoolean(),
version.toString()
);
return new org.elasticsearch.action.main.MainResponse(nodeName, version, clusterName, clusterUuid , build);
@@ -62,7 +62,6 @@ protected void assertInstances(org.elasticsearch.action.main.MainResponse server
assertThat(serverTestInstance.getBuild().hash(), equalTo(clientInstance.getVersion().getBuildHash()));
assertThat(serverTestInstance.getVersion().toString(), equalTo(clientInstance.getVersion().getNumber()));
assertThat(serverTestInstance.getBuild().date(), equalTo(clientInstance.getVersion().getBuildDate()));
assertThat(serverTestInstance.getBuild().flavor().displayName(), equalTo(clientInstance.getVersion().getBuildFlavor()));
assertThat(serverTestInstance.getBuild().type().displayName(), equalTo(clientInstance.getVersion().getBuildType()));
assertThat(serverTestInstance.getVersion().luceneVersion.toString(), equalTo(clientInstance.getVersion().getLuceneVersion()));
assertThat(serverTestInstance.getVersion().minimumIndexCompatibilityVersion().toString(),
@@ -46,7 +46,6 @@ public void testMain() throws IOException {
String nodeName = response.getNodeName();
MainResponse.Version version = response.getVersion();
String buildDate = version.getBuildDate();
String buildFlavor = version.getBuildFlavor();
String buildHash = version.getBuildHash();
String buildType = version.getBuildType();
String luceneVersion = version.getLuceneVersion();
@@ -59,7 +58,6 @@ public void testMain() throws IOException {
assertNotNull(nodeName);
assertNotNull(version);
assertNotNull(buildDate);
assertNotNull(buildFlavor);
assertNotNull(buildHash);
assertNotNull(buildType);
assertNotNull(luceneVersion);
80 changes: 14 additions & 66 deletions distribution/archives/build.gradle
@@ -17,28 +17,25 @@
* under the License.
*/

import java.nio.file.Files
import java.nio.file.Path

apply plugin: 'elasticsearch.internal-distribution-archive-setup'

CopySpec archiveFiles(CopySpec modulesFiles, String distributionType, String platform, String architecture, boolean oss, boolean jdk) {
CopySpec archiveFiles(CopySpec modulesFiles, String distributionType, String platform, String architecture, boolean jdk) {
return copySpec {
into("elasticsearch-${version}") {
into('lib') {
with libFiles(oss)
with libFiles()
}
into('config') {
dirMode 0750
fileMode 0660
with configFiles(distributionType, oss, jdk)
with configFiles(distributionType, jdk)
from {
dirMode 0750
jvmOptionsDir.getParent()
}
}
into('bin') {
with binFiles(distributionType, oss, jdk)
with binFiles(distributionType, jdk)
}
if (jdk) {
into("darwin".equals(platform) ? 'jdk.app' : 'jdk') {
@@ -65,7 +62,7 @@ CopySpec archiveFiles(CopySpec modulesFiles, String distributionType, String pla
rename { 'LICENSE.txt' }
}

with noticeFile(oss, jdk)
with noticeFile(jdk)
into('modules') {
with modulesFiles
}
@@ -76,105 +73,56 @@ CopySpec archiveFiles(CopySpec modulesFiles, String distributionType, String pla
distribution_archives {
integTestZip {
content {
archiveFiles(transportModulesFiles, 'zip', null, 'x64', true, false)
}
}

windowsZip {
archiveClassifier = 'windows-x86_64'
content {
archiveFiles(modulesFiles(false, 'windows-x86_64'), 'zip', 'windows', 'x64', false, true)
archiveFiles(transportModulesFiles, 'zip', null, 'x64', false)
}
}

ossWindowsZip {
archiveClassifier = 'windows-x86_64'
content {
archiveFiles(modulesFiles(true, 'windows-x86_64'), 'zip', 'windows', 'x64', true, true)
}
}

noJdkWindowsZip {
archiveClassifier = 'no-jdk-windows-x86_64'
content {
archiveFiles(modulesFiles(false, 'windows-x86_64'), 'zip', 'windows', 'x64', false, false)
archiveFiles(modulesFiles('windows-x86_64'), 'zip', 'windows', 'x64', true)
}
}

ossNoJdkWindowsZip {
archiveClassifier = 'no-jdk-windows-x86_64'
content {
archiveFiles(modulesFiles(true, 'windows-x86_64'), 'zip', 'windows', 'x64', true, false)
}
}

darwinTar {
archiveClassifier = 'darwin-x86_64'
content {
archiveFiles(modulesFiles(false, 'darwin-x86_64'), 'tar', 'darwin', 'x64', false, true)
archiveFiles(modulesFiles('windows-x86_64'), 'zip', 'windows', 'x64', false)
}
}

ossDarwinTar {
archiveClassifier = 'darwin-x86_64'
content {
archiveFiles(modulesFiles(true, 'darwin-x86_64'), 'tar', 'darwin', 'x64', true, true)
}
}

noJdkDarwinTar {
archiveClassifier = 'no-jdk-darwin-x86_64'
content {
archiveFiles(modulesFiles(false, 'darwin-x86_64'), 'tar', 'darwin', 'x64', false, false)
archiveFiles(modulesFiles('darwin-x86_64'), 'tar', 'darwin', 'x64', true)
}
}

ossNoJdkDarwinTar {
archiveClassifier = 'no-jdk-darwin-x86_64'
content {
archiveFiles(modulesFiles(true, 'darwin-x86_64'), 'tar', 'darwin', 'x64', true, false)
}
}

linuxAarch64Tar {
archiveClassifier = 'linux-aarch64'
content {
archiveFiles(modulesFiles(false, 'linux-aarch64'), 'tar', 'linux', 'aarch64', false, true)
}
}

linuxTar {
archiveClassifier = 'linux-x86_64'
content {
archiveFiles(modulesFiles(false, 'linux-x86_64'), 'tar', 'linux', 'x64', false, true)
archiveFiles(modulesFiles('darwin-x86_64'), 'tar', 'darwin', 'x64', false)
}
}

ossLinuxAarch64Tar {
archiveClassifier = 'linux-aarch64'
content {
archiveFiles(modulesFiles(true, 'linux-aarch64'), 'tar', 'linux', 'aarch64', true, true)
archiveFiles(modulesFiles('linux-aarch64'), 'tar', 'linux', 'aarch64', true)
}
}

ossLinuxTar {
archiveClassifier = 'linux-x86_64'
content {
archiveFiles(modulesFiles(true, 'linux-x86_64'), 'tar', 'linux', 'x64', true, true)
}
}

noJdkLinuxTar {
archiveClassifier = 'no-jdk-linux-x86_64'
content {
archiveFiles(modulesFiles(false, 'linux-x86_64'), 'tar', 'linux', 'x64', false, false)
archiveFiles(modulesFiles('linux-x86_64'), 'tar', 'linux', 'x64', true)
}
}

ossNoJdkLinuxTar {
archiveClassifier = 'no-jdk-linux-x86_64'
content {
archiveFiles(modulesFiles(true, 'linux-x86_64'), 'tar', 'linux', 'x64', true, false)
archiveFiles(modulesFiles('linux-x86_64'), 'tar', 'linux', 'x64', false)
}
}
}
@@ -183,5 +131,5 @@ subprojects {
apply plugin: 'distribution'
apply plugin: 'elasticsearch.internal-distribution-archive-check'

group = "org.elasticsearch.distribution.${name.startsWith("oss-") ? "oss" : "default"}"
group = "org.elasticsearch.distribution.oss"
}
2 changes: 0 additions & 2 deletions distribution/archives/darwin-tar/build.gradle

This file was deleted.

2 changes: 0 additions & 2 deletions distribution/archives/linux-aarch64-tar/build.gradle

This file was deleted.

2 changes: 0 additions & 2 deletions distribution/archives/linux-tar/build.gradle

This file was deleted.

2 changes: 0 additions & 2 deletions distribution/archives/no-jdk-darwin-tar/build.gradle

This file was deleted.

2 changes: 0 additions & 2 deletions distribution/archives/no-jdk-linux-tar/build.gradle

This file was deleted.

2 changes: 0 additions & 2 deletions distribution/archives/no-jdk-windows-zip/build.gradle

This file was deleted.

2 changes: 0 additions & 2 deletions distribution/archives/windows-zip/build.gradle

This file was deleted.

77 changes: 19 additions & 58 deletions distribution/build.gradle
@@ -49,15 +49,6 @@ tasks.register("generateDependenciesReport", ConcatFilesTask) {
String sourceUrl = "https://hg.openjdk.java.net/jdk-updates/jdk${jdkMajorVersion}u/archive/jdk-${jdkVersion}.tar.gz"
additionalLines << "OpenJDK,${jdkVersion},https://openjdk.java.net/,GPL-2.0-with-classpath-exception,${sourceUrl}".toString()

// Explicitly add the dependency on the RHEL UBI Docker base image
String[] rhelUbiFields = [
'Red Hat Universal Base Image minimal',
'8',
'https://catalog.redhat.com/software/containers/ubi8/ubi-minimal/5c359a62bed8bd75a2c3fba8',
'Custom;https://www.redhat.com/licenses/EULA_Red_Hat_Universal_Base_Image_English_20190422.pdf',
'https://oss-dependencies.elastic.co/redhat/ubi/ubi-minimal-8-source.tar.gz'
]
additionalLines << rhelUbiFields.join(',')
}

/*****************************************************************************
@@ -334,7 +325,7 @@ configure(subprojects.findAll { ['archives', 'packages'].contains(it.name) }) {
/*****************************************************************************
* Common files in all distributions *
*****************************************************************************/
libFiles = { oss ->
libFiles = {
copySpec {
// delay by using closures, since they have not yet been configured, so no jar task exists yet
from(configurations.libs)
@@ -344,15 +335,10 @@ configure(subprojects.findAll { ['archives', 'packages'].contains(it.name) }) {
into('tools/keystore-cli') {
from(configurations.libsKeystoreCli)
}
if (oss == false) {
into('tools/security-cli') {
from(configurations.libsSecurityCli)
}
}
}
}

modulesFiles = { oss, platform ->
modulesFiles = { platform ->
copySpec {
eachFile {
if (it.relativePath.segments[-2] == 'bin' || (platform == 'darwin-x86_64' && it.relativePath.segments[-2] == 'MacOS')) {
@@ -363,12 +349,7 @@ configure(subprojects.findAll { ['archives', 'packages'].contains(it.name) }) {
it.mode = 0644
}
}
def buildModules
if (oss) {
buildModules = buildOssModulesTaskProvider
} else {
buildModules = buildDefaultModulesTaskProvider
}
def buildModules = buildOssModulesTaskProvider
List excludePlatforms = ['linux-x86_64', 'linux-aarch64', 'windows-x86_64', 'darwin-x86_64']
if (platform != null) {
excludePlatforms.remove(excludePlatforms.indexOf(platform))
@@ -393,41 +374,36 @@ configure(subprojects.findAll { ['archives', 'packages'].contains(it.name) }) {
from buildTransportModulesTaskProvider
}

configFiles = { distributionType, oss, jdk ->
configFiles = { distributionType, jdk ->
copySpec {
with copySpec {
// main config files, processed with distribution specific substitutions
from '../src/config'
exclude 'log4j2.properties' // this is handled separately below
MavenFilteringHack.filter(it, expansionsForDistribution(distributionType, oss, jdk))
}
if (oss) {
from project(':distribution').buildOssLog4jConfig
from project(':distribution').buildOssConfig
} else {
from project(':distribution').buildDefaultLog4jConfig
from project(':distribution').buildDefaultConfig
MavenFilteringHack.filter(it, expansionsForDistribution(distributionType, jdk))
}
from project(':distribution').buildOssLog4jConfig
from project(':distribution').buildOssConfig
}
}

binFiles = { distributionType, oss, jdk ->
binFiles = { distributionType, jdk ->
copySpec {
// non-windows files, for all distributions
with copySpec {
from '../src/bin'
exclude '*.exe'
exclude '*.bat'
eachFile { it.setMode(0755) }
MavenFilteringHack.filter(it, expansionsForDistribution(distributionType, oss, jdk))
MavenFilteringHack.filter(it, expansionsForDistribution(distributionType, jdk))
}
// windows files, only for zip
if (distributionType == 'zip') {
with copySpec {
from '../src/bin'
include '*.bat'
filter(FixCrLfFilter, eol: FixCrLfFilter.CrLf.newInstance('crlf'))
MavenFilteringHack.filter(it, expansionsForDistribution(distributionType, oss, jdk))
MavenFilteringHack.filter(it, expansionsForDistribution(distributionType, jdk))
}
with copySpec {
from '../src/bin'
@@ -437,31 +413,23 @@ configure(subprojects.findAll { ['archives', 'packages'].contains(it.name) }) {
// module provided bin files
with copySpec {
eachFile { it.setMode(0755) }
if (oss) {
from project(':distribution').buildOssBin
} else {
from project(':distribution').buildDefaultBin
}
from project(':distribution').buildOssBin
if (distributionType != 'zip') {
exclude '*.bat'
}
}
}
}

noticeFile = { oss, jdk ->
noticeFile = { jdk ->
copySpec {
if (project.name == 'integ-test-zip') {
from buildServerNoticeTaskProvider
} else {
if (oss && jdk) {
if (jdk) {
from buildOssNoticeTaskProvider
} else if (oss) {
from buildOssNoJdkNoticeTaskProvider
} else if (jdk) {
from buildDefaultNoticeTaskProvider
} else {
from buildDefaultNoJdkNoticeTaskProvider
from buildOssNoJdkNoticeTaskProvider
}
}
}
@@ -522,7 +490,7 @@ configure(subprojects.findAll { ['archives', 'packages'].contains(it.name) }) {
* </dl>
*/
subprojects {
ext.expansionsForDistribution = { distributionType, oss, jdk ->
ext.expansionsForDistribution = { distributionType, jdk ->
final String defaultHeapSize = "1g"
final String packagingPathData = "path.data: /var/lib/elasticsearch"
final String pathLogs = "/var/log/elasticsearch"
@@ -598,11 +566,6 @@ subprojects {
'def': footer
],

'es.distribution.flavor': [
'def': oss ? 'oss' : 'default'
],


'es.distribution.type': [
'deb': 'deb',
'rpm': 'rpm',
@@ -649,13 +612,11 @@ subprojects {
}
}

['archives:windows-zip', 'archives:oss-windows-zip',
'archives:darwin-tar', 'archives:oss-darwin-tar',
'archives:linux-aarch64-tar', 'archives:oss-linux-aarch64-tar',
'archives:linux-tar', 'archives:oss-linux-tar',
['archives:oss-windows-zip',
'archives:oss-darwin-tar',
'archives:oss-linux-aarch64-tar',
'archives:oss-linux-tar',
'archives:integ-test-zip',
'packages:rpm', 'packages:deb',
'packages:aarch64-rpm', 'packages:aarch64-deb',
'packages:oss-rpm', 'packages:oss-deb',
'packages:aarch64-oss-rpm', 'packages:aarch64-oss-deb'
].forEach { subName ->
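The expansions map built by `expansionsForDistribution` is consumed by `MavenFilteringHack.filter(copySpec, expansions)` when the config and bin templates are copied into each distribution. As a rough illustration of that substitution step, here is a self-contained Gradle sketch that performs the same kind of token replacement with the stock `ReplaceTokens` filter; the task name, source path, and token values are hypothetical, and the build's `${token}` syntax is approximated with `@token@` markers rather than the project's own helper.

```groovy
import org.apache.tools.ant.filters.ReplaceTokens

// Hypothetical stand-in for the MavenFilteringHack step: render templated
// config files by substituting a per-distribution token map. Paths and
// values below are illustrative only.
tasks.register('renderConfigTemplates', Copy) {
  def expansions = [
    'es.distribution.type': 'tar',   // e.g. 'deb', 'rpm', 'zip', 'docker'
    'es.bundled_jdk'      : 'true'
  ]
  from 'src/config'
  into "${buildDir}/rendered-config"
  // ReplaceTokens swaps @token@ markers; the real helper handles ${token} syntax.
  filter(ReplaceTokens, tokens: expansions)
}
```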
122 changes: 37 additions & 85 deletions distribution/docker/build.gradle
@@ -1,6 +1,5 @@
import org.elasticsearch.gradle.Architecture
import org.elasticsearch.gradle.DockerBase
import org.elasticsearch.gradle.ElasticsearchDistribution.Flavor
import org.elasticsearch.gradle.LoggedExec
import org.elasticsearch.gradle.VersionProperties
import org.elasticsearch.gradle.docker.DockerBuildTask
@@ -14,20 +13,16 @@ apply plugin: 'elasticsearch.rest-resources'
testFixtures.useFixture()

configurations {
aarch64DockerSource
dockerSource
aarch64OssDockerSource
ossDockerSource
}

dependencies {
aarch64DockerSource project(path: ":distribution:archives:linux-aarch64-tar", configuration:"default")
dockerSource project(path: ":distribution:archives:linux-tar", configuration:"default")
aarch64OssDockerSource project(path: ":distribution:archives:oss-linux-aarch64-tar", configuration:"default")
ossDockerSource project(path: ":distribution:archives:oss-linux-tar", configuration:"default")
}

ext.expansions = { Architecture architecture, boolean oss, DockerBase base, boolean local ->
ext.expansions = { Architecture architecture, DockerBase base, boolean local ->
String classifier
if (local) {
if (architecture == Architecture.AARCH64) {
@@ -44,7 +39,7 @@ ext.expansions = { Architecture architecture, boolean oss, DockerBase base, bool
classifier = "linux-\$(arch)"
}

final String elasticsearch = "elasticsearch-${oss ? 'oss-' : ''}${VersionProperties.elasticsearch}-${classifier}.tar.gz"
final String elasticsearch = "elasticsearch-oss-${VersionProperties.elasticsearch}-${classifier}.tar.gz"

/* Both the following Dockerfile commands put the resulting artifact at
* the same location, regardless of classifier, so that the commands that
@@ -66,83 +61,62 @@ RUN curl --retry 8 -S -L \\
'build_date' : BuildParams.buildDate,
'git_revision' : BuildParams.gitRevision,
'license' : 'Apache-2.0',
'package_manager' : base == DockerBase.UBI ? 'microdnf' : 'yum',
'package_manager' : 'yum',
'source_elasticsearch': sourceElasticsearch,
'docker_base' : base.name().toLowerCase(),
'version' : VersionProperties.elasticsearch
]
}

private static String buildPath(Architecture architecture, boolean oss, DockerBase base) {
private static String buildPath(Architecture architecture, DockerBase base) {
return 'build/' +
(architecture == Architecture.AARCH64 ? 'aarch64-' : '') +
(oss ? 'oss-' : '') +
(base == DockerBase.UBI ? 'ubi-' : '') +
'oss-' +
'docker'
}

private static String taskName(String prefix, Architecture architecture, boolean oss, DockerBase base, String suffix) {
private static String taskName(String prefix, Architecture architecture, DockerBase base, String suffix) {
return prefix +
(architecture == Architecture.AARCH64 ? 'Aarch64' : '') +
(oss ? 'Oss' : '') +
(base == DockerBase.UBI ? 'Ubi' : '') +
'Oss' +
suffix
}

project.ext {
dockerBuildContext = { Architecture architecture, boolean oss, DockerBase base, boolean local ->
dockerBuildContext = { Architecture architecture, DockerBase base, boolean local ->
copySpec {
into('bin') {
from project.projectDir.toPath().resolve("src/docker/bin")
}

into('config') {
/*
* The OSS and default distributions have different configurations, therefore we want to allow overriding the default configuration
* from files in the 'oss' sub-directory. We don't want the 'oss' sub-directory to appear in the final build context, however.
*/
duplicatesStrategy = DuplicatesStrategy.EXCLUDE
from(project.projectDir.toPath().resolve("src/docker/config")) {
exclude 'oss'
}
if (oss) {
// Overlay the config file
from project.projectDir.toPath().resolve("src/docker/config/oss")
}
from project.projectDir.toPath().resolve("src/docker/config")
}

from(project.projectDir.toPath().resolve("src/docker/Dockerfile")) {
expand(expansions(architecture, oss, base, local))
expand(expansions(architecture, base, local))
}
}
}
}

void addCopyDockerContextTask(Architecture architecture, boolean oss, DockerBase base) {
if (oss && base != DockerBase.CENTOS) {
void addCopyDockerContextTask(Architecture architecture, DockerBase base) {
if (base != DockerBase.CENTOS) {
throw new GradleException("The only allowed docker base image for OSS builds is CENTOS")
}

tasks.register(taskName("copy", architecture, oss, base, "DockerContext"), Sync) {
expansions(architecture, oss, base, true).findAll { it.key != 'build_date' }.each { k, v ->
tasks.register(taskName("copy", architecture, base, "DockerContext"), Sync) {
expansions(architecture, base, true).findAll { it.key != 'build_date' }.each { k, v ->
inputs.property(k, { v.toString() })
}
into buildPath(architecture, oss, base)
into buildPath(architecture, base)

with dockerBuildContext(architecture, oss, base, true)
with dockerBuildContext(architecture, base, true)

if (architecture == Architecture.AARCH64) {
if (oss) {
from configurations.aarch64OssDockerSource
} else {
from configurations.aarch64DockerSource
}
from configurations.aarch64OssDockerSource
} else {
if (oss) {
from configurations.ossDockerSource
} else {
from configurations.dockerSource
}
from configurations.ossDockerSource
}
}
}
@@ -157,31 +131,24 @@ def createAndSetWritable(Object... locations) {

elasticsearch_distributions {
Architecture.values().each { eachArchitecture ->
Flavor.values().each { distroFlavor ->
"docker_$distroFlavor${ eachArchitecture == Architecture.AARCH64 ? '_aarch64' : '' }" {
architecture = eachArchitecture
flavor = distroFlavor
type = 'docker'
version = VersionProperties.getElasticsearch()
failIfUnavailable = false // This ensures we don't attempt to build images if docker is unavailable
}
"docker${ eachArchitecture == Architecture.AARCH64 ? '_aarch64' : '' }" {
architecture = eachArchitecture
type = 'docker'
version = VersionProperties.getElasticsearch()
failIfUnavailable = false // This ensures we don't attempt to build images if docker is unavailable
}
}
}

tasks.named("preProcessFixture").configure {
dependsOn elasticsearch_distributions.docker_default, elasticsearch_distributions.docker_oss
dependsOn elasticsearch_distributions.docker
doLast {
// tests expect to have an empty repo
project.delete(
"${buildDir}/repo",
"${buildDir}/oss-repo"
)
createAndSetWritable(
"${buildDir}/repo",
"${buildDir}/oss-repo",
"${buildDir}/logs/default-1",
"${buildDir}/logs/default-2",
"${buildDir}/logs/oss-1",
"${buildDir}/logs/oss-2"
)
@@ -198,50 +165,36 @@ tasks.named("check").configure {
dependsOn "integTest"
}

void addBuildDockerImage(Architecture architecture, boolean oss, DockerBase base) {
if (oss && base != DockerBase.CENTOS) {
void addBuildDockerImage(Architecture architecture, DockerBase base) {
if (base != DockerBase.CENTOS) {
throw new GradleException("The only allowed docker base image for OSS builds is CENTOS")
}

final TaskProvider<DockerBuildTask> buildDockerImageTask =
tasks.register(taskName("build", architecture, oss, base, "DockerImage"), DockerBuildTask) {
tasks.register(taskName("build", architecture, base, "DockerImage"), DockerBuildTask) {
onlyIf { Architecture.current() == architecture }
TaskProvider<Sync> copyContextTask = tasks.named(taskName("copy", architecture, oss, base, "DockerContext"))
TaskProvider<Sync> copyContextTask = tasks.named(taskName("copy", architecture, base, "DockerContext"))
dependsOn(copyContextTask)
dockerContext.fileProvider(copyContextTask.map { it.destinationDir })
baseImages = [ base.getImage() ]

String version = VersionProperties.elasticsearch
if (oss) {
tags = [
"docker.elastic.co/elasticsearch/elasticsearch-oss:${version}",
"elasticsearch-oss:test"
]
} else {
String suffix = base == DockerBase.UBI ? '-ubi8' : ''
tags = [
"elasticsearch${suffix}:${version}",
"docker.elastic.co/elasticsearch/elasticsearch${suffix}:${version}",
"docker.elastic.co/elasticsearch/elasticsearch-full${suffix}:${version}",
"elasticsearch${suffix}:test",
]
}
}
tasks.named("assemble").configure {
dependsOn(buildDockerImageTask)
}
}

for (final Architecture architecture : Architecture.values()) {
// We only create Docker images for the OSS distribution on CentOS.
for (final DockerBase base : DockerBase.values()) {
for (final boolean oss : [false, true]) {
if (oss && base != DockerBase.CENTOS) {
// We only create Docker images for the OSS distribution on CentOS.
// Other bases only use the default distribution.
continue
}
addCopyDockerContextTask(architecture, oss, base)
addBuildDockerImage(architecture, oss, base)
if (base == DockerBase.CENTOS) {
addCopyDockerContextTask(architecture, base)
addBuildDockerImage(architecture, base)
}
}
}
@@ -262,16 +215,15 @@ subprojects { Project subProject ->
apply plugin: 'distribution'

final Architecture architecture = subProject.name.contains('aarch64-') ? Architecture.AARCH64 : Architecture.X64
final boolean oss = subProject.name.contains('oss-')
final DockerBase base = subProject.name.contains('ubi-') ? DockerBase.UBI : DockerBase.CENTOS
final DockerBase base = DockerBase.CENTOS

final String arch = architecture == Architecture.AARCH64 ? '-aarch64' : ''
final String suffix = oss ? '-oss' : base == DockerBase.UBI ? '-ubi8' : ''
final String extension = base == DockerBase.UBI ? 'ubi.tar' : 'docker.tar'
final String suffix = '-oss'
final String extension = 'docker.tar'
final String artifactName = "elasticsearch${arch}${suffix}_test"

final String exportTaskName = taskName("export", architecture, oss, base, "DockerImage")
final String buildTaskName = taskName("build", architecture, oss, base, "DockerImage")
final String exportTaskName = taskName("export", architecture, base, "DockerImage")
final String buildTaskName = taskName("build", architecture, base, "DockerImage")
final String tarFile = "${parent.projectDir}/build/${artifactName}_${VersionProperties.elasticsearch}.${extension}"

tasks.register(exportTaskName, LoggedExec) {
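Because only the OSS CentOS image remains, `taskName(prefix, architecture, base, suffix)` always injects the literal `Oss` segment, so the registered tasks end up with names such as `copyOssDockerContext`, `buildOssDockerImage`, and `buildAarch64OssDockerImage`. A stand-alone Groovy sketch of that naming rule, with the `DockerBase` parameter dropped (only CENTOS is left) and `Arch` as a hypothetical stand-in for the build's `Architecture` enum:

```groovy
// Stand-alone illustration of the Docker task-naming rule used above.
enum Arch { X64, AARCH64 }

String dockerTaskName(String prefix, Arch arch, String suffix) {
  return prefix + (arch == Arch.AARCH64 ? 'Aarch64' : '') + 'Oss' + suffix
}

assert dockerTaskName('build', Arch.X64, 'DockerImage')     == 'buildOssDockerImage'
assert dockerTaskName('build', Arch.AARCH64, 'DockerImage') == 'buildAarch64OssDockerImage'
assert dockerTaskName('copy',  Arch.X64, 'DockerContext')   == 'copyOssDockerContext'
assert dockerTaskName('export', Arch.X64, 'DockerImage')    == 'exportOssDockerImage'
```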
2 changes: 0 additions & 2 deletions distribution/docker/docker-aarch64-export/build.gradle

This file was deleted.

14 changes: 0 additions & 14 deletions distribution/docker/docker-build-context/build.gradle

This file was deleted.

2 changes: 0 additions & 2 deletions distribution/docker/docker-export/build.gradle

This file was deleted.

2 changes: 1 addition & 1 deletion distribution/docker/docker-test-entrypoint.sh
@@ -1,5 +1,5 @@
#!/bin/bash
cd /usr/share/elasticsearch/bin/
./elasticsearch-users useradd x_pack_rest_user -p x-pack-test-password -r superuser || true
./elasticsearch-users useradd rest_user -p test-password -r superuser || true
echo "testnode" > /tmp/password
/usr/local/bin/docker-entrypoint.sh | tee > /usr/share/elasticsearch/logs/console.log
2 changes: 1 addition & 1 deletion distribution/docker/oss-docker-build-context/build.gradle
@@ -8,7 +8,7 @@ tasks.register("buildOssDockerBuildContext", Tar) {
archiveClassifier = "docker-build-context"
archiveBaseName = "elasticsearch-oss"
// Non-local builds don't need to specify an architecture.
with dockerBuildContext(null, true, DockerBase.CENTOS, false)
with dockerBuildContext(null, DockerBase.CENTOS, false)
}

tasks.named("assemble").configure { dependsOn "buildOssDockerBuildContext" }
18 changes: 1 addition & 17 deletions distribution/docker/src/docker/Dockerfile
@@ -20,10 +20,6 @@
################################################################################
FROM ${base_image} AS builder
<% if (docker_base == 'ubi') { %>
# Install required packages to extract the Elasticsearch distribution
RUN ${package_manager} install -y tar gzip
<% } %>
# `tini` is a tiny but valid init for containers. This is used to cleanly
# control how ES and any child processes are shut down.
#
@@ -70,7 +66,7 @@ ENV ELASTIC_CONTAINER true
RUN for iter in {1..10}; do \\
${package_manager} update --setopt=tsflags=nodocs -y && \\
${package_manager} install --setopt=tsflags=nodocs -y \\
nc shadow-utils zip unzip <%= docker_base == 'ubi' ? 'findutils procps-ng' : '' %> && \\
nc shadow-utils zip unzip && \\
${package_manager} clean all && exit_code=0 && break || exit_code=\$? && echo "${package_manager} error: retry \$iter in 10s" && \\
sleep 10; \\
done; \\
@@ -124,18 +120,6 @@ LABEL org.label-schema.build-date="${build_date}" \\
org.opencontainers.image.url="https://www.elastic.co/products/elasticsearch" \\
org.opencontainers.image.vendor="Elastic" \\
org.opencontainers.image.version="${version}"
<% if (docker_base == 'ubi') { %>
LABEL name="Elasticsearch" \\
maintainer="infra@elastic.co" \\
vendor="Elastic" \\
version="${version}" \\
release="1" \\
summary="Elasticsearch" \\
description="You know, for search."

RUN mkdir /licenses && \\
cp LICENSE.txt /licenses/LICENSE
<% } %>

ENTRYPOINT ["/tini", "--", "/usr/local/bin/docker-entrypoint.sh"]
# Dummy overridable parameter parsed by entrypoint
Original file line number Diff line number Diff line change
@@ -21,12 +21,8 @@
import com.carrotsearch.randomizedtesting.annotations.ParametersFactory;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.client.Request;
import org.elasticsearch.common.CharArrays;
import org.elasticsearch.common.io.PathUtils;
import org.elasticsearch.common.settings.SecureString;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.util.concurrent.ThreadContext;
import org.elasticsearch.test.rest.ESRestTestCase;
import org.elasticsearch.test.rest.yaml.ClientYamlTestCandidate;
import org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase;
import org.junit.AfterClass;
@@ -35,16 +31,13 @@

import java.io.IOException;
import java.net.URISyntaxException;
import java.nio.CharBuffer;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.Base64;

public class DockerYmlTestSuiteIT extends ESClientYamlSuiteTestCase {

private static final String USER = "x_pack_rest_user";
private static final String PASS = "x-pack-test-password";
private static final String USER = "rest_user";
private static final String PASS = "test-password";
private static final String KEYSTORE_PASS = "testnode";

public DockerYmlTestSuiteIT(ClientYamlTestCandidate testCandidate) {
@@ -58,13 +51,12 @@ public static Iterable<Object[]> parameters() throws Exception {

@Override
protected String getTestRestCluster() {
String distribution = getDistribution();
return new StringBuilder()
.append("localhost:")
.append(getProperty("test.fixtures.elasticsearch-" + distribution + "-1.tcp.9200"))
.append(getProperty("test.fixtures.elasticsearch-oss-1.tcp.9200"))
.append(",")
.append("localhost:")
.append(getProperty("test.fixtures.elasticsearch-" + distribution + "-2.tcp.9200"))
.append(getProperty("test.fixtures.elasticsearch-oss-2.tcp.9200"))
.toString();
}

@@ -73,18 +65,6 @@ protected boolean randomizeContentType() {
return false;
}

private String getDistribution() {
String distribution = System.getProperty("tests.distribution", "default");
if (distribution.equals("oss") == false && distribution.equals("default") == false) {
throw new IllegalArgumentException("supported values for tests.distribution are oss or default but it was " + distribution);
}
return distribution;
}

private boolean isOss() {
return getDistribution().equals("oss");
}

private String getProperty(String key) {
String value = System.getProperty(key);
if (value == null) {
@@ -124,40 +104,11 @@ public static void clearKeyStore() {

@Override
protected Settings restClientSettings() {
if (isOss()) {
return super.restClientSettings();
}
String token = basicAuthHeaderValue(USER, new SecureString(PASS.toCharArray()));
return Settings.builder()
.put(ThreadContext.PREFIX + ".Authorization", token)
.put(ESRestTestCase.TRUSTSTORE_PATH, keyStore)
.put(ESRestTestCase.TRUSTSTORE_PASSWORD, KEYSTORE_PASS)
.build();
return super.restClientSettings();
}

@Override
protected String getProtocol() {
if (isOss()) {
return "http";
}
return "https";
}

private static String basicAuthHeaderValue(String username, SecureString passwd) {
CharBuffer chars = CharBuffer.allocate(username.length() + passwd.length() + 1);
byte[] charBytes = null;
try {
chars.put(username).put(':').put(passwd.getChars());
charBytes = CharArrays.toUtf8Bytes(chars.array());

//TODO we still have passwords in Strings in headers. Maybe we can look into using a CharSequence?
String basicToken = Base64.getEncoder().encodeToString(charBytes);
return "Basic " + basicToken;
} finally {
Arrays.fill(chars.array(), (char) 0);
if (charBytes != null) {
Arrays.fill(charBytes, (byte) 0);
}
}
return "http";
}
}
2 changes: 0 additions & 2 deletions distribution/docker/ubi-docker-aarch64-export/build.gradle

This file was deleted.

13 changes: 0 additions & 13 deletions distribution/docker/ubi-docker-build-context/build.gradle

This file was deleted.

2 changes: 0 additions & 2 deletions distribution/docker/ubi-docker-export/build.gradle

This file was deleted.

2 changes: 0 additions & 2 deletions distribution/packages/aarch64-deb/build.gradle

This file was deleted.

2 changes: 0 additions & 2 deletions distribution/packages/aarch64-rpm/build.gradle

This file was deleted.

110 changes: 32 additions & 78 deletions distribution/packages/build.gradle
@@ -55,23 +55,23 @@ plugins {
id "nebula.ospackage-base" version "8.3.0"
}

void addProcessFilesTask(String type, boolean oss, boolean jdk) {
String packagingFiles = "build/packaging/${oss ? 'oss-' : ''}${jdk ? '' : 'no-jdk-'}${type}"
void addProcessFilesTask(String type, boolean jdk) {
String packagingFiles = "build/packaging/oss-${jdk ? '' : 'no-jdk-'}${type}"

String taskName = "process${oss ? 'Oss' : ''}${jdk ? '' : 'NoJdk'}${type.capitalize()}Files"
String taskName = "processOss${jdk ? '' : 'NoJdk'}${type.capitalize()}Files"
tasks.register(taskName, Copy) {
into packagingFiles

with copySpec {
from 'src/common'
from "src/${type}"
MavenFilteringHack.filter(it, expansionsForDistribution(type, oss, jdk))
MavenFilteringHack.filter(it, expansionsForDistribution(type, jdk))
}

into('etc/elasticsearch') {
with configFiles(type, oss, jdk)
with configFiles(type, jdk)
}
MavenFilteringHack.filter(it, expansionsForDistribution(type, oss, jdk))
MavenFilteringHack.filter(it, expansionsForDistribution(type, jdk))

doLast {
// create empty dirs, we set the permissions when configuring the packages
@@ -86,25 +86,21 @@ void addProcessFilesTask(String type, boolean oss, boolean jdk) {
}
}

addProcessFilesTask('deb', true, true)
addProcessFilesTask('deb', true, false)
addProcessFilesTask('deb', false, true)
addProcessFilesTask('deb', false, false)
addProcessFilesTask('rpm', true, true)
addProcessFilesTask('rpm', true, false)
addProcessFilesTask('rpm', false, true)
addProcessFilesTask('rpm', false, false)
addProcessFilesTask('deb', true)
addProcessFilesTask('deb', false)
addProcessFilesTask('rpm', true)
addProcessFilesTask('rpm', false)

// Common configuration that is package dependent. This can't go in ospackage
// since we have different templated files that need to be consumed, but the structure
// is the same
Closure commonPackageConfig(String type, boolean oss, boolean jdk, String architecture) {
Closure commonPackageConfig(String type, boolean jdk, String architecture) {
return {
onlyIf {
OS.current().equals(OS.WINDOWS) == false
}
dependsOn "process${oss ? 'Oss' : ''}${jdk ? '' : 'NoJdk'}${type.capitalize()}Files"
packageName "elasticsearch${oss ? '-oss' : ''}"
dependsOn "processOss${jdk ? '' : 'NoJdk'}${type.capitalize()}Files"
packageName "elasticsearch-oss"
if (type == 'deb') {
if (architecture == 'x64') {
arch('amd64')
@@ -123,13 +119,13 @@ Closure commonPackageConfig(String type, boolean oss, boolean jdk, String archit
}
// Follow elasticsearch's file naming convention
String jdkString = jdk ? "" : "no-jdk-"
String prefix = "${architecture == 'aarch64' ? 'aarch64-' : ''}${oss ? 'oss-' : ''}${jdk ? '' : 'no-jdk-'}${type}"
String prefix = "${architecture == 'aarch64' ? 'aarch64-' : ''}oss-${jdk ? '' : 'no-jdk-'}${type}"
destinationDirectory = file("${prefix}/build/distributions")

// SystemPackagingTask overrides default archive task convention mappings, but doesn't provide a setter so we have to override the convention mapping itself
conventionMapping.archiveFile = { objects.fileProperty().fileValue(file("${destinationDirectory.get()}/${packageName}-${project.version}-${jdkString}${archString}.${type}")) }

String packagingFiles = "build/packaging/${oss ? 'oss-' : ''}${jdk ? '' : 'no-jdk-'}${type}"
String packagingFiles = "build/packaging/oss-${jdk ? '' : 'no-jdk-'}${type}"

String scripts = "${packagingFiles}/scripts"
preInstall file("${scripts}/preinst")
@@ -144,17 +140,17 @@ Closure commonPackageConfig(String type, boolean oss, boolean jdk, String archit
// specify it again explicitly for copying common files
into('/usr/share/elasticsearch') {
into('bin') {
with binFiles(type, oss, jdk)
with binFiles(type, jdk)
}
from(rootProject.projectDir) {
include 'README.asciidoc'
fileMode 0644
}
into('lib') {
with libFiles(oss)
with libFiles()
}
into('modules') {
with modulesFiles(oss, 'linux-' + ((architecture == 'x64') ? 'x86_64' : architecture))
with modulesFiles('linux-' + ((architecture == 'x64') ? 'x86_64' : architecture))
}
if (jdk) {
into('jdk') {
@@ -200,12 +196,6 @@ Closure commonPackageConfig(String type, boolean oss, boolean jdk, String archit
configurationFile '/etc/elasticsearch/elasticsearch.yml'
configurationFile '/etc/elasticsearch/jvm.options'
configurationFile '/etc/elasticsearch/log4j2.properties'
if (oss == false) {
configurationFile '/etc/elasticsearch/role_mapping.yml'
configurationFile '/etc/elasticsearch/roles.yml'
configurationFile '/etc/elasticsearch/users'
configurationFile '/etc/elasticsearch/users_roles'
}
from("${packagingFiles}") {
dirMode 02750
into('/etc')
@@ -224,7 +214,7 @@ Closure commonPackageConfig(String type, boolean oss, boolean jdk, String archit
createDirectoryEntry true
fileType CONFIG | NOREPLACE
}
String envFile = expansionsForDistribution(type, oss, jdk)['path.env']
String envFile = expansionsForDistribution(type, jdk)['path.env']
configurationFile envFile
into(new File(envFile).getParent()) {
fileType CONFIG | NOREPLACE
@@ -279,11 +269,8 @@ Closure commonPackageConfig(String type, boolean oss, boolean jdk, String archit
copyEmptyDir('/var/lib/elasticsearch', 'elasticsearch', 'elasticsearch', 02750)
copyEmptyDir('/usr/share/elasticsearch/plugins', 'root', 'root', 0755)

// the oss package conflicts with the default distribution and vice versa
conflicts('elasticsearch' + (oss ? '' : '-oss'))

into '/usr/share/elasticsearch'
with noticeFile(oss, jdk)
with noticeFile(jdk)
}
}

@@ -321,9 +308,9 @@ ospackage {
into '/usr/share/elasticsearch'
}

Closure commonDebConfig(boolean oss, boolean jdk, String architecture) {
Closure commonDebConfig(boolean jdk, String architecture) {
return {
configure(commonPackageConfig('deb', oss, jdk, architecture))
configure(commonPackageConfig('deb', jdk, architecture))

// jdeb does not provide a way to set the License control attribute, and ospackage
// silently ignores setting it. Instead, we set the license as "custom field"
@@ -340,40 +327,26 @@ Closure commonDebConfig(boolean oss, boolean jdk, String architecture) {

into('/usr/share/lintian/overrides') {
from('src/deb/lintian/elasticsearch')
if (oss) {
rename('elasticsearch', 'elasticsearch-oss')
}
rename('elasticsearch', 'elasticsearch-oss')
}
}
}

tasks.register('buildAarch64Deb', Deb) {
configure(commonDebConfig(false, true, 'aarch64'))
}

tasks.register('buildDeb', Deb) {
configure(commonDebConfig(false, true, 'x64'))
}

tasks.register('buildAarch64OssDeb', Deb) {
configure(commonDebConfig(true, true, 'aarch64'))
configure(commonDebConfig(true, 'aarch64'))
}

tasks.register('buildOssDeb', Deb) {
configure(commonDebConfig(true, true, 'x64'))
}

tasks.register('buildNoJdkDeb', Deb) {
configure(commonDebConfig(false, false, 'x64'))
configure(commonDebConfig(true, 'x64'))
}

tasks.register('buildOssNoJdkDeb', Deb) {
configure(commonDebConfig(true, false, 'x64'))
configure(commonDebConfig(true, 'x64'))
}

Closure commonRpmConfig(boolean oss, boolean jdk, String architecture) {
Closure commonRpmConfig(boolean jdk, String architecture) {
return {
configure(commonPackageConfig('rpm', oss, jdk, architecture))
configure(commonPackageConfig('rpm', jdk, architecture))

license 'ASL 2.0'

@@ -396,28 +369,16 @@ Closure commonRpmConfig(boolean oss, boolean jdk, String architecture) {
}
}

tasks.register('buildAarch64Rpm', Rpm) {
configure(commonRpmConfig(false, true, 'aarch64'))
}

tasks.register('buildRpm', Rpm) {
configure(commonRpmConfig(false, true, 'x64'))
}

tasks.register('buildAarch64OssRpm', Rpm) {
configure(commonRpmConfig(true, true, 'aarch64'))
configure(commonRpmConfig(true, 'aarch64'))
}

tasks.register('buildOssRpm', Rpm) {
configure(commonRpmConfig(true, true, 'x64'))
}

tasks.register('buildNoJdkRpm', Rpm) {
configure(commonRpmConfig(false, false, 'x64'))
configure(commonRpmConfig(true, 'x64'))
}

tasks.register('buildOssNoJdkRpm', Rpm) {
configure(commonRpmConfig(true, false, 'x64'))
configure(commonRpmConfig(true, 'x64'))
}

Closure dpkgExists = { it -> new File('/bin/dpkg-deb').exists() || new File('/usr/bin/dpkg-deb').exists() || new File('/usr/local/bin/dpkg-deb').exists() }
@@ -494,14 +455,7 @@ subprojects {
Path copyrightPath
String expectedLicense
String licenseFilename
if (project.name.contains('oss-')) {
copyrightPath = packageExtractionDir.toPath().resolve("usr/share/doc/elasticsearch-oss/copyright")
}
// TODO - remove this block and only check for the OSS distribution
// https://github.com/opendistro-for-elasticsearch/search/issues/50
else {
copyrightPath = packageExtractionDir.toPath().resolve("usr/share/doc/elasticsearch/copyright")
}
copyrightPath = packageExtractionDir.toPath().resolve("usr/share/doc/elasticsearch-oss/copyright")
expectedLicense = "ASL-2.0"
licenseFilename = "APACHE-LICENSE-2.0.txt"
final List<String> header = Arrays.asList("Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/",
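The Deb and Rpm tasks above avoid duplication by returning their shared settings from closures (`commonPackageConfig`, `commonDebConfig`, `commonRpmConfig`) and applying them with `configure(...)` inside each `tasks.register` block. A stripped-down sketch of the same pattern, reusing the `configure(...)` idiom this build file already relies on; the task names, archive type, and properties here are hypothetical:

```groovy
// Hypothetical example of the shared-closure pattern used for the packages:
// one closure captures the settings that vary only by parameter, and each
// registered task applies it via configure(...).
Closure commonArchiveConfig(String classifier) {
  return {
    archiveBaseName = 'elasticsearch-oss'
    archiveClassifier = classifier
    destinationDirectory = file("${buildDir}/distributions")
  }
}

tasks.register('buildLinuxTarball', Tar) {
  configure(commonArchiveConfig('linux-x86_64'))
}

tasks.register('buildAarch64Tarball', Tar) {
  configure(commonArchiveConfig('linux-aarch64'))
}
```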
2 changes: 0 additions & 2 deletions distribution/packages/deb/build.gradle

This file was deleted.

2 changes: 0 additions & 2 deletions distribution/packages/no-jdk-deb/build.gradle

This file was deleted.

2 changes: 0 additions & 2 deletions distribution/packages/no-jdk-rpm/build.gradle

This file was deleted.

Binary file not shown.
2 changes: 0 additions & 2 deletions distribution/packages/rpm/build.gradle

This file was deleted.

Binary file not shown.
2 changes: 0 additions & 2 deletions distribution/src/bin/elasticsearch
@@ -62,7 +62,6 @@ if [[ $DAEMONIZE = false ]]; then
$ES_JAVA_OPTS \
-Des.path.home="$ES_HOME" \
-Des.path.conf="$ES_PATH_CONF" \
-Des.distribution.flavor="$ES_DISTRIBUTION_FLAVOR" \
-Des.distribution.type="$ES_DISTRIBUTION_TYPE" \
-Des.bundled_jdk="$ES_BUNDLED_JDK" \
-cp "$ES_CLASSPATH" \
@@ -75,7 +74,6 @@ else
$ES_JAVA_OPTS \
-Des.path.home="$ES_HOME" \
-Des.path.conf="$ES_PATH_CONF" \
-Des.distribution.flavor="$ES_DISTRIBUTION_FLAVOR" \
-Des.distribution.type="$ES_DISTRIBUTION_TYPE" \
-Des.bundled_jdk="$ES_BUNDLED_JDK" \
-cp "$ES_CLASSPATH" \
1 change: 0 additions & 1 deletion distribution/src/bin/elasticsearch-cli
@@ -26,7 +26,6 @@ exec \
$ES_JAVA_OPTS \
-Des.path.home="$ES_HOME" \
-Des.path.conf="$ES_PATH_CONF" \
-Des.distribution.flavor="$ES_DISTRIBUTION_FLAVOR" \
-Des.distribution.type="$ES_DISTRIBUTION_TYPE" \
-cp "$ES_CLASSPATH" \
"$ES_MAIN_CLASS" \
1 change: 0 additions & 1 deletion distribution/src/bin/elasticsearch-cli.bat
@@ -20,7 +20,6 @@ set ES_JAVA_OPTS=-Xms4m -Xmx64m -XX:+UseSerialGC %ES_JAVA_OPTS%
%ES_JAVA_OPTS% ^
-Des.path.home="%ES_HOME%" ^
-Des.path.conf="%ES_PATH_CONF%" ^
-Des.distribution.flavor="%ES_DISTRIBUTION_FLAVOR%" ^
-Des.distribution.type="%ES_DISTRIBUTION_TYPE%" ^
-cp "%ES_CLASSPATH%" ^
"%ES_MAIN_CLASS%" ^
1 change: 0 additions & 1 deletion distribution/src/bin/elasticsearch-env
@@ -88,7 +88,6 @@ fi
# now make ES_PATH_CONF absolute
ES_PATH_CONF=`cd "$ES_PATH_CONF"; pwd`

ES_DISTRIBUTION_FLAVOR=${es.distribution.flavor}
ES_DISTRIBUTION_TYPE=${es.distribution.type}
ES_BUNDLED_JDK=${es.bundled_jdk}

1 change: 0 additions & 1 deletion distribution/src/bin/elasticsearch-env.bat
@@ -25,7 +25,6 @@ if not defined ES_PATH_CONF (
rem now make ES_PATH_CONF absolute
for %%I in ("%ES_PATH_CONF%..") do set ES_PATH_CONF=%%~dpfI

set ES_DISTRIBUTION_FLAVOR=${es.distribution.flavor}
set ES_DISTRIBUTION_TYPE=${es.distribution.type}
set ES_BUNDLED_JDK=${es.bundled_jdk}

2 changes: 1 addition & 1 deletion distribution/src/bin/elasticsearch-service.bat
@@ -194,7 +194,7 @@ if "%JVM_SS%" == "" (
set OTHER_JAVA_OPTS=%OTHER_JAVA_OPTS:"=%
set OTHER_JAVA_OPTS=%OTHER_JAVA_OPTS:~1%

set ES_PARAMS=-Delasticsearch;-Des.path.home="%ES_HOME%";-Des.path.conf="%ES_PATH_CONF%";-Des.distribution.flavor="%ES_DISTRIBUTION_FLAVOR%";-Des.distribution.type="%ES_DISTRIBUTION_TYPE%";-Des.bundled_jdk="%ES_BUNDLED_JDK%"
set ES_PARAMS=-Delasticsearch;-Des.path.home="%ES_HOME%";-Des.path.conf="%ES_PATH_CONF%";-Des.distribution.type="%ES_DISTRIBUTION_TYPE%";-Des.bundled_jdk="%ES_BUNDLED_JDK%"

if "%ES_START_TYPE%" == "" set ES_START_TYPE=manual
if "%ES_STOP_TIMEOUT%" == "" set ES_STOP_TIMEOUT=0
1 change: 0 additions & 1 deletion distribution/src/bin/elasticsearch.bat
@@ -99,7 +99,6 @@ SET KEYSTORE_PASSWORD=!KEYSTORE_PASSWORD:^\=^^^\!

ECHO.!KEYSTORE_PASSWORD!| %JAVA% %ES_JAVA_OPTS% -Delasticsearch ^
-Des.path.home="%ES_HOME%" -Des.path.conf="%ES_PATH_CONF%" ^
-Des.distribution.flavor="%ES_DISTRIBUTION_FLAVOR%" ^
-Des.distribution.type="%ES_DISTRIBUTION_TYPE%" ^
-Des.bundled_jdk="%ES_BUNDLED_JDK%" ^
-cp "%ES_CLASSPATH%" "org.elasticsearch.bootstrap.Elasticsearch" !newparams!
1 change: 0 additions & 1 deletion docs/plugins/discovery-azure-classic.asciidoc
@@ -350,7 +350,6 @@ This command should give you a JSON result:
"cluster_uuid" : "AT69_T_DTp-1qgIJlatQqA",
"version" : {
"number" : "{version_qualified}",
"build_flavor" : "{build_flavor}",
"build_type" : "{build_type}",
"build_hash" : "f27399d",
"build_date" : "2016-03-30T09:51:41.449Z",
2 changes: 0 additions & 2 deletions docs/reference/cluster/nodes-info.asciidoc
@@ -209,7 +209,6 @@ The API returns the following response:
"host": "node-0.elastic.co",
"ip": "192.168.17",
"version": "{version}",
"build_flavor": "{build_flavor}",
"build_type": "{build_type}",
"build_hash": "587409e",
"roles": [
@@ -280,7 +279,6 @@ The API returns the following response:
"host": "node-0.elastic.co",
"ip": "192.168.17",
"version": "{version}",
"build_flavor": "{build_flavor}",
"build_type": "{build_type}",
"build_hash": "587409e",
"roles": [],
1 change: 0 additions & 1 deletion docs/reference/setup/install/check-running.asciidoc
@@ -18,7 +18,6 @@ which should give you a response something like this:
"cluster_uuid" : "AT69_T_DTp-1qgIJlatQqA",
"version" : {
"number" : "{version_qualified}",
"build_flavor" : "{build_flavor}",
"build_type" : "{build_type}",
"build_hash" : "f27399d",
"build_date" : "2016-03-30T09:51:41.449Z",
1 change: 0 additions & 1 deletion gradle/local-distribution.gradle
@@ -29,7 +29,6 @@ apply plugin:'elasticsearch.internal-distribution-download'

elasticsearch_distributions {
local {
flavor = 'default'
type = 'archive'
architecture = Architecture.current()
}
Original file line number Diff line number Diff line change
@@ -339,12 +339,6 @@ public void test91ElasticsearchShardCliPackaging() throws Exception {
final Result result = sh.run(bin.shardTool + " -h");
assertThat(result.stdout, containsString("A CLI tool to remove corrupted parts of unrecoverable shards"));
};

// TODO: this should be checked on all distributions
if (distribution().isDefault()) {
Platforms.onLinux(action);
Platforms.onWindows(action);
}
}

public void test92ElasticsearchNodeCliPackaging() throws Exception {
@@ -354,12 +348,6 @@ public void test92ElasticsearchNodeCliPackaging() throws Exception {
final Result result = sh.run(bin.nodeTool + " -h");
assertThat(result.stdout, containsString("A CLI tool to do unsafe cluster and index manipulations on current node"));
};

// TODO: this should be checked on all distributions
if (distribution().isDefault()) {
Platforms.onLinux(action);
Platforms.onWindows(action);
}
}

public void test93ElasticsearchNodeCustomDataPathAndNotEsHomeWorkDir() throws Exception {
Original file line number Diff line number Diff line change
@@ -52,12 +52,5 @@ public void test06Dependencies() {
final Shell.Result result = sh.run("dpkg -I " + getDistributionFile(distribution()));

TestCase.assertTrue(Pattern.compile("(?m)^ Depends:.*bash.*").matcher(result.stdout).find());

String oppositePackageName = "elasticsearch";
if (distribution().isDefault()) {
oppositePackageName += "-oss";
}

TestCase.assertTrue(Pattern.compile("(?m)^ Conflicts: " + oppositePackageName + "$").matcher(result.stdout).find());
}
}
Original file line number Diff line number Diff line change
@@ -32,10 +32,8 @@
import static org.elasticsearch.packaging.util.Packages.assertInstalled;
import static org.elasticsearch.packaging.util.Packages.assertRemoved;
import static org.elasticsearch.packaging.util.Packages.installPackage;
import static org.elasticsearch.packaging.util.Packages.packageStatus;
import static org.elasticsearch.packaging.util.Packages.remove;
import static org.elasticsearch.packaging.util.Packages.verifyPackageInstallation;
import static org.hamcrest.core.Is.is;
import static org.junit.Assume.assumeTrue;

public class DebPreservationTests extends PackagingTestCase {
@@ -67,43 +65,14 @@ public void test20Remove() throws Exception {
installation.config(Paths.get("jvm.options.d", "heap.options"))
);

if (distribution().isDefault()) {
assertPathsExist(
installation.config,
installation.config("role_mapping.yml"),
installation.config("roles.yml"),
installation.config("users"),
installation.config("users_roles")
);
}

// keystore was removed

assertPathsDoNotExist(installation.config("elasticsearch.keystore"), installation.config(".elasticsearch.keystore.initial_md5sum"));

// doc files were removed

assertPathsDoNotExist(
Paths.get("/usr/share/doc/" + distribution().flavor.name),
Paths.get("/usr/share/doc/" + distribution().flavor.name + "/copyright")
);

// sysvinit service file was not removed
assertThat(SYSVINIT_SCRIPT, fileExists());

// defaults file was not removed
assertThat(installation.envFile, fileExists());
}

public void test30Purge() throws Exception {
append(installation.config(Paths.get("jvm.options.d", "heap.options")), "# foo");

sh.run("dpkg --purge " + distribution().flavor.name);

assertRemoved(distribution());

assertPathsDoNotExist(installation.config, installation.envFile, SYSVINIT_SCRIPT);

assertThat(packageStatus(distribution()).exitCode, is(1));
}
}