diff --git a/modules/local-caching-classloader/.gitignore b/modules/local-caching-classloader/.gitignore new file mode 100644 index 0000000..55d7f58 --- /dev/null +++ b/modules/local-caching-classloader/.gitignore @@ -0,0 +1,36 @@ +# +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# https://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +# + +# Maven ignores +/target/ + +# IDE ignores +/.settings/ +/.project +/.classpath +/.pydevproject +/.idea +/*.iml +/*.ipr +/*.iws +/nbproject/ +/nbactions.xml +/nb-configuration.xml +/.vscode/ +/.factorypath diff --git a/modules/local-caching-classloader/README.md b/modules/local-caching-classloader/README.md new file mode 100644 index 0000000..f0f6d2e --- /dev/null +++ b/modules/local-caching-classloader/README.md @@ -0,0 +1,104 @@ + + +# Local Caching ClassLoader + +The LocalCachingContextClassLoaderFactory is an Accumulo ContextClassLoaderFactory implementation that creates and maintains a +LocalCachingContext. The `LocalCachingContextClassLoaderFactory.getClassLoader(String)` method expects the method +argument to be a valid `file`, `hdfs`, `http` or `https` URL to a context definition file. + +The context definition file is a JSON formatted file that contains the name of the context, the interval (in seconds) at which +the context definition file should be monitored, and a list of classpath resources. The LocalCachingContextClassLoaderFactory +creates the LocalCachingContext based on the initial contents of the context definition file, and updates the classloader +as changes are noticed based on the monitoring interval. An example of the context definition file is below. + +``` +{ + "contextName": "myContext", + "monitorIntervalSeconds": 5, + "resources": [ + { + "location": "file:/home/user/ClassLoaderTestA/TestA.jar", + "checksum": "a10883244d70d971ec25cbfa69b6f08f" + }, + { + "location": "hdfs://localhost:8020/contextB/TestB.jar", + "checksum": "a02a3b7026528156fb782dcdecaaa097" + }, + { + "location": "http://localhost:80/TestC.jar", + "checksum": "f464e66f6d07a41c656e8f4679509215" + } + ] +} +``` + +## Creating a ContextDefinition file + +Users may take advantage of the `ContextDefinition.create` method to construct a ContextDefinition object. This +will calculate the checksums of the classpath elements. `ContextDefinition.toJson` can be used to serialize the +ContextDefinition to a file. + +## Updating a ContextDefinition file + +The LocalCachingContextClassLoaderFactory uses a background thread to fetch the context definition file at the +specified interval. Users can change the context name, monitor interval, and list of resources. Changes to the +context name are ignored however as the context cache directory is created using the context name upon initial +creation. 
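+
+For example, the `ContextDefinition.create` and `ContextDefinition.toJson` helpers described above can be used to
+generate or regenerate the definition file when the resource list changes. The sketch below is illustrative only; the
+class name, jar URL, and output path are placeholders, and it assumes this module's classes are on the classpath.
+
+```
+import java.net.URL;
+import java.nio.file.Files;
+import java.nio.file.Path;
+
+import org.apache.accumulo.classloader.lcc.definition.ContextDefinition;
+
+public class WriteContextDefinition {
+  public static void main(String[] args) throws Exception {
+    // Build a definition named "myContext" that is monitored every 5 seconds.
+    // The checksum of each resource is computed from its current contents.
+    ContextDefinition def = ContextDefinition.create("myContext", 5,
+        new URL("file:/home/user/ClassLoaderTestA/TestA.jar"));
+    // Serialize the definition and overwrite the context definition file.
+    Files.writeString(Path.of("/path/to/context/definition.json"), def.toJson());
+  }
+}
+```
+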
The LocalCachingContextClassLoaderFactory will schedule the next download of the context +definition file based on the updated monitor interval, and if the list of resources has changed, the resources will +be downloaded, verified against their checksums, and used to construct a new ClassLoader for the context. + +## Local Caching + +The property `general.custom.classloader.lcc.cache.dir` is required to be set to a local directory on the host. The +LocalCachingContext creates a directory at this location for each named context. Each context cache directory +contains a lock file and a copy of each resource named in the context definition file, stored using the format: +`fileName_checksum`. The lock file is used with Java's `FileChannel.tryLock` to enable exclusive access (on supported +platforms) to the directory from different processes on the same host. + +## Error Handling + +If there is an exception in creating the initial classloader, then a ContextClassLoaderException is thrown. If there is +an exception when updating the classloader, then the exception is logged and the classloader is not updated. Calls +to `LocalCachingContextClassLoaderFactory.getClassLoader(String)` will return the most recent classloader +with valid contents. If the checksum of a downloaded resource does not match the checksum in the context definition +file, then the downloaded version of the file is deleted from the context cache directory so that it can be retried +at the next interval. + +The property `general.custom.classloader.lcc.update.grace.minutes` determines how long the update process +continues to return the most recent valid classloader when an exception occurs in the background update thread. +A zero value (the default) causes the most recent valid classloader to be returned indefinitely. Otherwise, if updates +continue to fail for the configured number of minutes, the internal reference to the classloader is cleared. A subsequent +call to `LocalCachingContextClassLoaderFactory.getClassLoader(String)` then behaves like the initial call to +create the classloader and surfaces the exception to the calling code. + +## Cleanup + +Because the cache directory is shared among multiple processes, and one process can't know what the other processes are doing, +this class cannot clean up the shared cache directory. It is left to the user to remove unused context cache directories +and unused old files within a context cache directory. + +## Accumulo Configuration + +To use this with Accumulo: + + 1. Set the following Accumulo site properties: `general.context.class.loader.factory=org.apache.accumulo.classloader.lcc.LocalCachingContextClassLoaderFactory` +`general.custom.classloader.lcc.cache.dir=file:///path/to/some/directory` + + 2. 
Set the following table property: `table.class.loader.context=(file|hdfs|http|https)://path/to/context/definition.json` + + diff --git a/modules/local-caching-classloader/pom.xml b/modules/local-caching-classloader/pom.xml new file mode 100644 index 0000000..8b39c34 --- /dev/null +++ b/modules/local-caching-classloader/pom.xml @@ -0,0 +1,227 @@ + + + + 4.0.0 + + org.apache.accumulo + classloader-extras + 1.0.0-SNAPSHOT + ../../pom.xml + + local-caching-classloader + classloader-extras-local-caching + + --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.io=ALL-UNNAMED --add-opens java.base/java.net=ALL-UNNAMED --add-opens java.management/java.lang.management=ALL-UNNAMED --add-opens java.management/sun.management=ALL-UNNAMED --add-opens java.base/java.security=ALL-UNNAMED --add-opens java.base/java.lang.reflect=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.util.stream=ALL-UNNAMED --add-opens java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens java.base/java.time=ALL-UNNAMED + ../../src/build/eclipse-codestyle.xml + + false + 1 + + false + true + + false + 1 + + false + + + + + com.github.spotbugs + spotbugs-annotations + true + + + com.github.ben-manes.caffeine + caffeine + provided + + + com.google.code.gson + gson + provided + + + com.google.guava + guava + provided + + + commons-codec + commons-codec + provided + + + commons-io + commons-io + provided + + + + org.apache.accumulo + accumulo-core + provided + + + org.apache.hadoop + hadoop-client-api + provided + + + + org.slf4j + slf4j-api + provided + + + org.apache.accumulo + accumulo-minicluster + test + + + org.apache.accumulo + accumulo-test + test + + + org.apache.hadoop + hadoop-client-minicluster + test + + + org.apache.logging.log4j + log4j-slf4j2-impl + test + + + org.eclipse.jetty + jetty-server + test + + + org.eclipse.jetty + jetty-util + test + + + org.junit.jupiter + junit-jupiter-api + test + + + + + + org.apache.maven.plugins + maven-dependency-plugin + + + copy-test-jars + + copy + + process-test-resources + + + + org.apache.accumulo + example-iterators-a + ${project.version} + jar + true + ${project.build.directory}/test-classes/ExampleIteratorsA + example-iterators-a.jar + + + org.apache.accumulo + example-iterators-b + ${project.version} + jar + true + ${project.build.directory}/test-classes/ExampleIteratorsB + example-iterators-b.jar + + + + + + + + org.apache.maven.plugins + maven-surefire-plugin + + ${surefire.forkCount} + ${surefire.reuseForks} + ${surefire.excludedGroups} + ${surefire.groups} + + ${project.build.directory} + + ${accumulo.build.extraTestArgs} + + + + org.apache.maven.plugins + maven-failsafe-plugin + + ${failsafe.forkCount} + ${failsafe.reuseForks} + ${failsafe.excludedGroups} + ${failsafe.groups} + + ${accumulo.it.uniq.test.dir} + ${project.build.directory} + + ${accumulo.build.extraTestArgs} + false + + + + org.codehaus.mojo + exec-maven-plugin + + + build-test-jars + + exec + + process-test-classes + + src/test/shell/makeTestJars.sh + + + + build-helloworld-jars + + exec + + process-test-classes + + src/test/shell/makeHelloWorldJars.sh + + + + + + + diff --git a/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/Constants.java b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/Constants.java new file mode 100644 index 0000000..de0a49f --- /dev/null +++ 
b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/Constants.java @@ -0,0 +1,45 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.apache.accumulo.classloader.lcc; + +import java.util.concurrent.Executors; +import java.util.concurrent.ScheduledExecutorService; + +import org.apache.accumulo.core.conf.Property; +import org.apache.commons.codec.digest.DigestUtils; + +import com.google.gson.Gson; +import com.google.gson.GsonBuilder; + +public class Constants { + + public static final String CACHE_DIR_PROPERTY = + Property.GENERAL_ARBITRARY_PROP_PREFIX.getKey() + "classloader.lcc.cache.dir"; + + public static final String UPDATE_FAILURE_GRACE_PERIOD_MINS = + Property.GENERAL_ARBITRARY_PROP_PREFIX.getKey() + "classloader.lcc.update.grace.minutes"; + + public static final ScheduledExecutorService EXECUTOR = Executors.newScheduledThreadPool(0); + public static final Gson GSON = new GsonBuilder().disableJdkUnsafe().create(); + + public static DigestUtils getChecksummer() { + return new DigestUtils("SHA256"); + } + +} diff --git a/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/LocalCachingContext.java b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/LocalCachingContext.java new file mode 100644 index 0000000..43184f4 --- /dev/null +++ b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/LocalCachingContext.java @@ -0,0 +1,270 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.apache.accumulo.classloader.lcc; + +import static java.nio.file.StandardCopyOption.ATOMIC_MOVE; +import static java.nio.file.StandardCopyOption.REPLACE_EXISTING; +import static java.util.Objects.hash; +import static java.util.Objects.requireNonNull; + +import java.io.File; +import java.io.IOException; +import java.io.InputStream; +import java.net.URISyntaxException; +import java.net.URL; +import java.net.URLClassLoader; +import java.nio.file.Files; +import java.nio.file.Path; +import java.security.AccessController; +import java.security.PrivilegedAction; +import java.util.Arrays; +import java.util.HashSet; +import java.util.Iterator; +import java.util.Objects; +import java.util.Set; +import java.util.concurrent.TimeUnit; +import java.util.concurrent.atomic.AtomicReference; + +import org.apache.accumulo.classloader.lcc.cache.CacheUtils; +import org.apache.accumulo.classloader.lcc.cache.CacheUtils.LockInfo; +import org.apache.accumulo.classloader.lcc.definition.ContextDefinition; +import org.apache.accumulo.classloader.lcc.definition.Resource; +import org.apache.accumulo.classloader.lcc.resolvers.FileResolver; +import org.apache.accumulo.core.spi.common.ContextClassLoaderFactory.ContextClassLoaderException; +import org.apache.accumulo.core.util.Retry; +import org.apache.accumulo.core.util.Retry.RetryFactory; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public final class LocalCachingContext { + + private static class ClassPathElement { + private final FileResolver resolver; + private final URL localCachedCopyLocation; + private final String localCachedCopyDigest; + + public ClassPathElement(FileResolver resolver, URL localCachedCopy, + String localCachedCopyDigest) { + this.resolver = requireNonNull(resolver, "resolver must be supplied"); + this.localCachedCopyLocation = + requireNonNull(localCachedCopy, "local cached copy location must be supplied"); + this.localCachedCopyDigest = + requireNonNull(localCachedCopyDigest, "local cached copy md5 must be supplied"); + } + + public URL getLocalCachedCopyLocation() { + return localCachedCopyLocation; + } + + @Override + public int hashCode() { + return hash(localCachedCopyDigest, localCachedCopyLocation, resolver); + } + + @Override + public boolean equals(Object obj) { + if (this == obj) { + return true; + } + if (obj == null) { + return false; + } + if (getClass() != obj.getClass()) { + return false; + } + ClassPathElement other = (ClassPathElement) obj; + return Objects.equals(localCachedCopyDigest, other.localCachedCopyDigest) + && Objects.equals(localCachedCopyLocation, other.localCachedCopyLocation) + && Objects.equals(resolver, other.resolver); + } + + @Override + public String toString() { + StringBuilder buf = new StringBuilder(); + buf.append("source: ").append(resolver.getURL()); + buf.append(", cached copy:").append(localCachedCopyLocation); + return buf.toString(); + } + } + + private static final Logger LOG = LoggerFactory.getLogger(LocalCachingContext.class); + + private final Path contextCacheDir; + private final String contextName; + private final Set elements = new HashSet<>(); + private final AtomicReference classloader = new AtomicReference<>(); + private final AtomicReference definition = new AtomicReference<>(); + private final RetryFactory retryFactory = Retry.builder().infiniteRetries() + .retryAfter(1, TimeUnit.SECONDS).incrementBy(1, TimeUnit.SECONDS).maxWait(5, TimeUnit.MINUTES) + .backOffFactor(2).logInterval(1, TimeUnit.SECONDS).createFactory(); + + public LocalCachingContext(final 
String baseCacheDir, final ContextDefinition contextDefinition) + throws IOException, ContextClassLoaderException { + this.definition.set(requireNonNull(contextDefinition, "definition must be supplied")); + this.contextName = this.definition.get().getContextName(); + this.contextCacheDir = CacheUtils.createOrGetContextCacheDir(baseCacheDir, contextName); + } + + public ContextDefinition getDefinition() { + return definition.get(); + } + + private ClassPathElement cacheResource(final Resource resource) + throws InterruptedException, IOException, ContextClassLoaderException, URISyntaxException { + final FileResolver source = FileResolver.resolve(resource.getURL()); + final Path tmpCacheLocation = + contextCacheDir.resolve(source.getFileName() + "_" + resource.getChecksum() + "_tmp"); + final Path finalCacheLocation = + contextCacheDir.resolve(source.getFileName() + "_" + resource.getChecksum()); + final File cacheFile = finalCacheLocation.toFile(); + if (!Files.exists(finalCacheLocation)) { + Retry retry = retryFactory.createRetry(); + boolean successful = false; + while (!successful) { + LOG.trace("Caching resource {} at {}", source.getURL(), cacheFile.getAbsolutePath()); + try (InputStream is = source.getInputStream()) { + Files.copy(is, tmpCacheLocation, REPLACE_EXISTING); + Files.move(tmpCacheLocation, finalCacheLocation, ATOMIC_MOVE); + successful = true; + retry.logCompletion(LOG, + "Resource " + source.getURL() + " cached locally as " + finalCacheLocation); + } catch (IOException e) { + LOG.error("Error copying resource from {} to {}. Retrying...", source.getURL(), + finalCacheLocation, e); + retry.logRetry(LOG, "Unable to cache resource " + source.getURL()); + retry.waitForNextAttempt(LOG, "Cache resource " + source.getURL()); + } finally { + retry.useRetry(); + } + } + final String checksum = Constants.getChecksummer().digestAsHex(cacheFile); + if (!resource.getChecksum().equals(checksum)) { + LOG.error( + "Checksum {} for resource {} does not match checksum in context definition {}, removing cached copy.", + checksum, source.getURL(), resource.getChecksum()); + Files.delete(finalCacheLocation); + throw new IllegalStateException("Checksum " + checksum + " for resource " + source.getURL() + + " does not match checksum in context definition " + resource.getChecksum()); + } + return new ClassPathElement(source, cacheFile.toURI().toURL(), checksum); + } else { + // File exists, return new ClassPathElement based on existing file + LOG.trace("Resource {} is already cached at {}", source.getURL(), + cacheFile.getAbsolutePath()); + return new ClassPathElement(source, cacheFile.toURI().toURL(), resource.getChecksum()); + } + } + + private void cacheResources(final ContextDefinition def) + throws InterruptedException, IOException, ContextClassLoaderException, URISyntaxException { + synchronized (elements) { + for (Resource updatedResource : def.getResources()) { + ClassPathElement cpe = cacheResource(updatedResource); + elements.add(cpe); + LOG.trace("Added element {} to classpath", cpe); + } + classloader.set(null); + } + } + + public void initialize() + throws InterruptedException, IOException, ContextClassLoaderException, URISyntaxException { + try { + LockInfo lockInfo = CacheUtils.lockContextCacheDir(contextCacheDir); + while (lockInfo == null) { + // something else is updating this directory + LOG.info("Directory {} locked, another process must be updating the class loader contents. 
" + + "Retrying in 1 second", contextCacheDir); + Thread.sleep(1000); + lockInfo = CacheUtils.lockContextCacheDir(contextCacheDir); + } + synchronized (elements) { + try { + cacheResources(definition.get()); + } finally { + lockInfo.unlock(); + } + } + } catch (Exception e) { + LOG.error("Error initializing context: " + contextName, e); + throw e; + } + } + + public void update(final ContextDefinition update) + throws InterruptedException, IOException, ContextClassLoaderException, URISyntaxException { + requireNonNull(update, "definition must be supplied"); + if (definition.get().getResources().equals(update.getResources())) { + return; + } + try { + LockInfo lockInfo = CacheUtils.lockContextCacheDir(contextCacheDir); + while (lockInfo == null) { + // something else is updating this directory + LOG.info("Directory {} locked, another process must be updating the class loader contents. " + + "Retrying in 1 second", contextCacheDir); + Thread.sleep(1000); + lockInfo = CacheUtils.lockContextCacheDir(contextCacheDir); + } + synchronized (elements) { + try { + elements.clear(); + cacheResources(update); + this.definition.set(update); + } finally { + lockInfo.unlock(); + } + } + } catch (Exception e) { + LOG.error("Error updating context: " + contextName, e); + throw e; + } + } + + public ClassLoader getClassloader() { + + ClassLoader currentCL = classloader.get(); + if (currentCL != null) { + return currentCL; + } + + synchronized (elements) { + + currentCL = classloader.get(); + if (currentCL != null) { + return currentCL; + } + + LOG.trace("Class path contents have changed, creating new classloader"); + URL[] urls = new URL[elements.size()]; + Iterator iter = elements.iterator(); + for (int x = 0; x < elements.size(); x++) { + urls[x] = iter.next().getLocalCachedCopyLocation(); + } + final URLClassLoader cl = + AccessController.doPrivileged((PrivilegedAction) () -> { + return new URLClassLoader(contextName, urls, this.getClass().getClassLoader()); + }); + classloader.set(cl); + LOG.trace("New classloader created from URLs: {}", + Arrays.asList(classloader.get().getURLs())); + return cl; + } + } +} diff --git a/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/LocalCachingContextClassLoaderFactory.java b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/LocalCachingContextClassLoaderFactory.java new file mode 100644 index 0000000..14e7c32 --- /dev/null +++ b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/LocalCachingContextClassLoaderFactory.java @@ -0,0 +1,244 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.apache.accumulo.classloader.lcc; + +import static java.nio.charset.StandardCharsets.UTF_8; +import static java.util.Objects.requireNonNull; + +import java.io.IOException; +import java.io.InputStream; +import java.io.InputStreamReader; +import java.net.MalformedURLException; +import java.net.URISyntaxException; +import java.net.URL; +import java.security.NoSuchAlgorithmException; +import java.time.Duration; +import java.util.Arrays; +import java.util.HashMap; +import java.util.Map; +import java.util.concurrent.TimeUnit; +import java.util.concurrent.atomic.AtomicBoolean; + +import org.apache.accumulo.classloader.lcc.cache.CacheUtils; +import org.apache.accumulo.classloader.lcc.definition.ContextDefinition; +import org.apache.accumulo.classloader.lcc.definition.Resource; +import org.apache.accumulo.classloader.lcc.resolvers.FileResolver; +import org.apache.accumulo.core.spi.common.ContextClassLoaderEnvironment; +import org.apache.accumulo.core.spi.common.ContextClassLoaderFactory; +import org.apache.accumulo.core.util.Timer; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import com.github.benmanes.caffeine.cache.Cache; +import com.github.benmanes.caffeine.cache.Caffeine; +import com.google.common.base.Preconditions; + +/** + * A ContextClassLoaderFactory implementation that creates and maintains a ClassLoader for a named + * context. This factory expects the parameter passed to {@link #getClassLoader(String)} to be the + * URL of a JSON formatted {@link ContextDefinition} file. The file contains an interval at which + * this class should monitor the file for changes and a list of {@link Resource} objects. Each + * resource is defined by a URL to the file and an expected SHA-256 checksum. + *

+ * The URLs supplied for the context definition file and for the resources can use one of the + * following protocols: file://, http://, https://, or hdfs://. + *

+ * As this class processes the ContextDefinition, it fetches each resource from its URL and caches it in a + * directory on the local filesystem. This class uses the value of + * the property {@link Constants#CACHE_DIR_PROPERTY} passed via + * {@link #init(ContextClassLoaderEnvironment)} as the root directory and creates a sub-directory + * for each context name. Each context cache directory contains a lock file and a copy of each + * fetched resource that is named using the following format: fileName_checksum. + *

+ * The lock file prevents processes from manipulating the contents of the context cache directory + * concurrently, which enables the cache directories to be shared among multiple processes on the + * host. + *

+ * Note that because the cache directory is shared among multiple processes, and one process can't + * know what the other processes are doing, this class cannot clean up the shared cache directory. + * It is left to the user to remove unused context cache directories and unused old files within a + * context cache directory. + */ +public class LocalCachingContextClassLoaderFactory implements ContextClassLoaderFactory { + + private static final Logger LOG = + LoggerFactory.getLogger(LocalCachingContextClassLoaderFactory.class); + + private final Cache contexts = + Caffeine.newBuilder().expireAfterAccess(24, TimeUnit.HOURS).build(); + + private final Map classloaderFailures = new HashMap<>(); + private volatile String baseCacheDir; + private volatile Duration updateFailureGracePeriodMins; + + private ContextDefinition parseContextDefinition(final URL url) + throws ContextClassLoaderException { + LOG.trace("Retrieving context definition file from {}", url); + final FileResolver resolver = FileResolver.resolve(url); + try { + try (InputStream is = resolver.getInputStream()) { + ContextDefinition def = + Constants.GSON.fromJson(new InputStreamReader(is, UTF_8), ContextDefinition.class); + if (def == null) { + throw new ContextClassLoaderException( + "ContextDefinition null for context definition file: " + resolver.getURL()); + } + return def; + } + } catch (IOException e) { + throw new ContextClassLoaderException( + "Error reading context definition file: " + resolver.getURL(), e); + } + } + + /** + * Schedule a task to execute at {@code interval} seconds to update the LocalCachingContext if the + * ContextDefinition has changed. The task schedules a follow-on task at the update interval value + * (if it changed). + */ + private void monitorContext(final String contextLocation, int interval) { + Constants.EXECUTOR.schedule(() -> { + final LocalCachingContext classLoader = + contexts.policy().getIfPresentQuietly(contextLocation); + if (classLoader == null) { + // context has been removed from the map, no need to check for update + LOG.debug("ClassLoader for context {} not present, no longer monitoring for changes", + contextLocation); + return; + } + int nextInterval = interval; + final ContextDefinition currentDef = classLoader.getDefinition(); + try { + final URL contextLocationUrl = new URL(contextLocation); + final ContextDefinition update = parseContextDefinition(contextLocationUrl); + if (!Arrays.equals(currentDef.getChecksum(), update.getChecksum())) { + LOG.debug("Context definition for {} has changed", contextLocation); + if (!currentDef.getContextName().equals(update.getContextName())) { + LOG.warn( + "Context name changed for context {}, but context cache directory will remain {} (old={}, new={})", + contextLocation, currentDef.getContextName(), currentDef.getContextName(), + update.getContextName()); + } + classLoader.update(update); + nextInterval = update.getMonitorIntervalSeconds(); + classloaderFailures.remove(contextLocation); + } else { + LOG.trace("Context definition for {} has not changed", contextLocation); + } + } catch (ContextClassLoaderException | InterruptedException | IOException + | NoSuchAlgorithmException | URISyntaxException | RuntimeException e) { + LOG.error("Error parsing updated context definition at {}. 
Classloader NOT updated!", + contextLocation, e); + final Timer failureTimer = classloaderFailures.get(contextLocation); + if (updateFailureGracePeriodMins.isZero()) { + // failure monitoring is disabled + LOG.debug("Property {} not set, not tracking classloader failures for context {}", + Constants.UPDATE_FAILURE_GRACE_PERIOD_MINS, contextLocation); + } else if (failureTimer == null) { + // first failure, start the timer + classloaderFailures.put(contextLocation, Timer.startNew()); + LOG.debug( + "Tracking classloader failures for context {}, will NOT return working classloader if failures continue for {} minutes", + contextLocation, updateFailureGracePeriodMins.toMinutes()); + } else if (failureTimer.hasElapsed(updateFailureGracePeriodMins)) { + // has been failing for the grace period + // unset the classloader reference so that the failure + // will return from getClassLoader in the calling thread + LOG.info("Grace period for failing classloader has elapsed for context {}", + contextLocation); + contexts.invalidate(contextLocation); + classloaderFailures.remove(contextLocation); + } else { + LOG.trace("Failing to update classloader for context {} within the grace period", + contextLocation, e); + } + } finally { + monitorContext(contextLocation, nextInterval); + } + }, interval, TimeUnit.SECONDS); + LOG.trace("Monitoring context definition file {} for changes at {} second intervals", + contextLocation, interval); + } + + // for tests only + void resetForTests() { + // Removing the contexts will cause the + // background monitor task to end + contexts.invalidateAll(); + contexts.cleanUp(); + } + + @Override + public void init(ContextClassLoaderEnvironment env) { + baseCacheDir = requireNonNull(env.getConfiguration().get(Constants.CACHE_DIR_PROPERTY), + "Property " + Constants.CACHE_DIR_PROPERTY + " not set, cannot create cache directory."); + String graceProp = env.getConfiguration().get(Constants.UPDATE_FAILURE_GRACE_PERIOD_MINS); + long graceMins = graceProp == null ? 
0 : Long.parseLong(graceProp); + updateFailureGracePeriodMins = Duration.ofMinutes(graceMins); + try { + CacheUtils.createBaseCacheDir(baseCacheDir); + } catch (IOException | ContextClassLoaderException e) { + throw new IllegalStateException("Error creating base cache directory at " + baseCacheDir, e); + } + } + + @Override + public ClassLoader getClassLoader(final String contextLocation) + throws ContextClassLoaderException { + Preconditions.checkState(baseCacheDir != null, "init not called before calling getClassLoader"); + requireNonNull(contextLocation, "context name must be supplied"); + try { + final URL contextLocationUrl = new URL(contextLocation); + final AtomicBoolean newlyCreated = new AtomicBoolean(false); + final LocalCachingContext ccl = contexts.get(contextLocation, cn -> { + try { + ContextDefinition def = parseContextDefinition(contextLocationUrl); + LocalCachingContext newCcl = new LocalCachingContext(baseCacheDir, def); + newCcl.initialize(); + newlyCreated.set(true); + return newCcl; + } catch (Exception e) { + throw new RuntimeException("Error creating context classloader", e); + } + }); + if (newlyCreated.get()) { + monitorContext(contextLocation, ccl.getDefinition().getMonitorIntervalSeconds()); + } + return ccl.getClassloader(); + } catch (MalformedURLException e) { + throw new ContextClassLoaderException( + "Expected valid URL to context definition file but received: " + contextLocation, e); + } catch (RuntimeException re) { + Throwable t = re.getCause(); + if (t != null && t instanceof InterruptedException) { + Thread.currentThread().interrupt(); + } + if (t != null) { + if (t instanceof ContextClassLoaderException) { + throw (ContextClassLoaderException) t; + } else { + throw new ContextClassLoaderException(t.getMessage(), t); + } + } else { + throw new ContextClassLoaderException(re.getMessage(), re); + } + } + } + +} diff --git a/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/cache/CacheUtils.java b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/cache/CacheUtils.java new file mode 100644 index 0000000..e50cbbc --- /dev/null +++ b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/cache/CacheUtils.java @@ -0,0 +1,129 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.apache.accumulo.classloader.lcc.cache; + +import static java.nio.file.attribute.PosixFilePermission.OWNER_EXECUTE; +import static java.nio.file.attribute.PosixFilePermission.OWNER_READ; +import static java.nio.file.attribute.PosixFilePermission.OWNER_WRITE; +import static java.util.Objects.requireNonNull; + +import java.io.IOException; +import java.net.URI; +import java.nio.channels.FileChannel; +import java.nio.channels.FileLock; +import java.nio.channels.OverlappingFileLockException; +import java.nio.file.FileAlreadyExistsException; +import java.nio.file.Files; +import java.nio.file.Path; +import java.nio.file.StandardOpenOption; +import java.nio.file.attribute.FileAttribute; +import java.nio.file.attribute.PosixFilePermission; +import java.nio.file.attribute.PosixFilePermissions; +import java.util.EnumSet; +import java.util.Set; + +import org.apache.accumulo.core.spi.common.ContextClassLoaderFactory.ContextClassLoaderException; + +public class CacheUtils { + + private static final Set CACHE_DIR_PERMS = + EnumSet.of(OWNER_READ, OWNER_WRITE, OWNER_EXECUTE); + private static final FileAttribute> PERMISSIONS = + PosixFilePermissions.asFileAttribute(CACHE_DIR_PERMS); + private static final String lockFileName = "lock_file"; + + public static class LockInfo { + + private final FileChannel channel; + private final FileLock lock; + + public LockInfo(FileChannel channel, FileLock lock) { + this.channel = requireNonNull(channel, "channel must be supplied"); + this.lock = requireNonNull(lock, "lock must be supplied"); + } + + FileChannel getChannel() { + return channel; + } + + FileLock getLock() { + return lock; + } + + public void unlock() throws IOException { + lock.release(); + channel.close(); + } + + } + + private static Path mkdir(final Path p) throws IOException { + try { + return Files.createDirectory(p, PERMISSIONS); + } catch (FileAlreadyExistsException e) { + return p; + } + } + + public static Path createBaseCacheDir(final String baseCacheDir) + throws IOException, ContextClassLoaderException { + if (baseCacheDir == null) { + throw new ContextClassLoaderException("received null for cache directory"); + } + return mkdir(Path.of(URI.create(baseCacheDir))); + } + + public static Path createOrGetContextCacheDir(final String baseCacheDir, final String contextName) + throws IOException, ContextClassLoaderException { + Path baseContextDir = createBaseCacheDir(baseCacheDir); + return mkdir(baseContextDir.resolve(contextName)); + } + + /** + * Acquire an exclusive lock on the "lock_file" file in the context cache directory. Returns null + * if lock can not be acquired. 
Caller MUST call LockInfo.unlock when done manipulating the cache + * directory + */ + public static LockInfo lockContextCacheDir(final Path contextCacheDir) + throws ContextClassLoaderException { + final Path lockFilePath = contextCacheDir.resolve(lockFileName); + try { + final FileChannel channel = FileChannel.open(lockFilePath, + EnumSet.of(StandardOpenOption.CREATE, StandardOpenOption.WRITE), PERMISSIONS); + try { + final FileLock lock = channel.tryLock(); + if (lock == null) { + // something else has the lock + channel.close(); + return null; + } else { + return new LockInfo(channel, lock); + } + } catch (OverlappingFileLockException e) { + // something else has the lock + channel.close(); + return null; + } + } catch (IOException e) { + throw new ContextClassLoaderException("Error creating lock file in context cache directory " + + contextCacheDir.toFile().getAbsolutePath(), e); + } + } + +} diff --git a/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/definition/ContextDefinition.java b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/definition/ContextDefinition.java new file mode 100644 index 0000000..d1d180c --- /dev/null +++ b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/definition/ContextDefinition.java @@ -0,0 +1,127 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.apache.accumulo.classloader.lcc.definition; + +import static java.util.Objects.hash; +import static java.util.Objects.requireNonNull; + +import java.io.IOException; +import java.io.InputStream; +import java.net.URL; +import java.security.NoSuchAlgorithmException; +import java.util.Objects; +import java.util.TreeSet; + +import org.apache.accumulo.classloader.lcc.Constants; +import org.apache.accumulo.classloader.lcc.resolvers.FileResolver; +import org.apache.accumulo.core.spi.common.ContextClassLoaderFactory.ContextClassLoaderException; + +import com.google.common.base.Preconditions; + +import edu.umd.cs.findbugs.annotations.SuppressFBWarnings; + +@SuppressFBWarnings(value = {"EI_EXPOSE_REP"}) +public class ContextDefinition { + + public static ContextDefinition create(String contextName, int monitorIntervalSecs, + URL... 
sources) throws ContextClassLoaderException, IOException { + TreeSet resources = new TreeSet<>(); + for (URL u : sources) { + FileResolver resolver = FileResolver.resolve(u); + try (InputStream is = resolver.getInputStream()) { + String checksum = Constants.getChecksummer().digestAsHex(is); + resources.add(new Resource(u.toString(), checksum)); + } + } + return new ContextDefinition(contextName, monitorIntervalSecs, resources); + } + + private String contextName; + private volatile int monitorIntervalSeconds; + private TreeSet resources; + private volatile transient byte[] checksum = null; + + public ContextDefinition() {} + + public ContextDefinition(String contextName, int monitorIntervalSeconds, + TreeSet resources) { + this.contextName = requireNonNull(contextName, "context name must be supplied"); + Preconditions.checkArgument(monitorIntervalSeconds > 0, + "monitor interval must be greater than zero"); + this.monitorIntervalSeconds = monitorIntervalSeconds; + this.resources = requireNonNull(resources, "resources must be supplied"); + } + + public String getContextName() { + return contextName; + } + + public int getMonitorIntervalSeconds() { + return monitorIntervalSeconds; + } + + public TreeSet getResources() { + return resources; + } + + public void setContextName(String contextName) { + this.contextName = contextName; + } + + public void setMonitorIntervalSeconds(int monitorIntervalSeconds) { + this.monitorIntervalSeconds = monitorIntervalSeconds; + } + + public void setResources(TreeSet resources) { + this.resources = resources; + } + + @Override + public int hashCode() { + return hash(contextName, monitorIntervalSeconds, resources); + } + + @Override + public boolean equals(Object obj) { + if (this == obj) { + return true; + } + if (obj == null) { + return false; + } + if (getClass() != obj.getClass()) { + return false; + } + ContextDefinition other = (ContextDefinition) obj; + return Objects.equals(contextName, other.contextName) + && monitorIntervalSeconds == other.monitorIntervalSeconds + && Objects.equals(resources, other.resources); + } + + public synchronized byte[] getChecksum() throws NoSuchAlgorithmException { + if (checksum == null) { + checksum = Constants.getChecksummer().digest(toJson()); + } + return checksum; + } + + public String toJson() { + return Constants.GSON.toJson(this); + } +} diff --git a/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/definition/Resource.java b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/definition/Resource.java new file mode 100644 index 0000000..119824e --- /dev/null +++ b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/definition/Resource.java @@ -0,0 +1,85 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. 
See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.apache.accumulo.classloader.lcc.definition; + +import java.net.MalformedURLException; +import java.net.URL; +import java.util.Objects; + +public class Resource implements Comparable { + + private String location; + private String checksum; + + public Resource() {} + + public Resource(String location, String checksum) { + this.location = location; + this.checksum = checksum; + } + + public String getLocation() { + return location; + } + + public String getChecksum() { + return checksum; + } + + public void setLocation(String location) { + this.location = location; + } + + public void setChecksum(String checksum) { + this.checksum = checksum; + } + + public URL getURL() throws MalformedURLException { + return new URL(location); + } + + @Override + public int hashCode() { + return Objects.hash(checksum, location); + } + + @Override + public boolean equals(Object obj) { + if (this == obj) { + return true; + } + if (obj == null) { + return false; + } + if (getClass() != obj.getClass()) { + return false; + } + Resource other = (Resource) obj; + return Objects.equals(checksum, other.checksum) && Objects.equals(location, other.location); + } + + @Override + public int compareTo(Resource other) { + int result = this.location.compareTo(other.location); + if (result == 0) { + return this.checksum.compareTo(other.checksum); + } + return result; + } +} diff --git a/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/resolvers/FileResolver.java b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/resolvers/FileResolver.java new file mode 100644 index 0000000..88014df --- /dev/null +++ b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/resolvers/FileResolver.java @@ -0,0 +1,82 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.apache.accumulo.classloader.lcc.resolvers; + +import static java.util.Objects.hash; +import static java.util.Objects.requireNonNull; + +import java.io.IOException; +import java.io.InputStream; +import java.net.URL; +import java.util.Objects; + +import org.apache.accumulo.core.spi.common.ContextClassLoaderFactory.ContextClassLoaderException; + +public abstract class FileResolver { + + public static FileResolver resolve(URL url) throws ContextClassLoaderException { + requireNonNull(url, "URL must be supplied"); + switch (url.getProtocol()) { + case "http": + case "https": + return new HttpFileResolver(url); + case "file": + return new LocalFileResolver(url); + case "hdfs": + return new HdfsFileResolver(url); + default: + throw new ContextClassLoaderException("Unhandled protocol: " + url.getProtocol()); + } + } + + private final URL url; + + protected FileResolver(URL url) throws ContextClassLoaderException { + this.url = url; + } + + public URL getURL() { + return this.url; + } + + public abstract String getFileName(); + + public abstract InputStream getInputStream() throws IOException; + + @Override + public int hashCode() { + return hash(url); + } + + @Override + public boolean equals(Object obj) { + if (this == obj) { + return true; + } + if (obj == null) { + return false; + } + if (getClass() != obj.getClass()) { + return false; + } + FileResolver other = (FileResolver) obj; + return Objects.equals(url, other.url); + } + +} diff --git a/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/resolvers/HdfsFileResolver.java b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/resolvers/HdfsFileResolver.java new file mode 100644 index 0000000..f163ba1 --- /dev/null +++ b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/resolvers/HdfsFileResolver.java @@ -0,0 +1,63 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.apache.accumulo.classloader.lcc.resolvers; + +import java.io.IOException; +import java.io.InputStream; +import java.net.URI; +import java.net.URISyntaxException; +import java.net.URL; + +import org.apache.accumulo.core.spi.common.ContextClassLoaderFactory.ContextClassLoaderException; +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.fs.FileSystem; +import org.apache.hadoop.fs.Path; + +public final class HdfsFileResolver extends FileResolver { + + private final Configuration hadoopConf = new Configuration(); + private final FileSystem fs; + private final Path path; + + protected HdfsFileResolver(URL url) throws ContextClassLoaderException { + super(url); + try { + final URI uri = url.toURI(); + this.fs = FileSystem.get(uri, hadoopConf); + this.path = fs.makeQualified(new Path(uri)); + if (!fs.exists(this.path)) { + throw new ContextClassLoaderException("File: " + url + " does not exist."); + } + } catch (URISyntaxException e) { + throw new ContextClassLoaderException("Error creating URI from url: " + url, e); + } catch (IOException e) { + throw new ContextClassLoaderException("Error resolving file from url: " + url, e); + } + } + + @Override + public String getFileName() { + return this.path.getName(); + } + + @Override + public InputStream getInputStream() throws IOException { + return fs.open(path); + } +} diff --git a/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/resolvers/HttpFileResolver.java b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/resolvers/HttpFileResolver.java new file mode 100644 index 0000000..76f8442 --- /dev/null +++ b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/resolvers/HttpFileResolver.java @@ -0,0 +1,46 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.apache.accumulo.classloader.lcc.resolvers; + +import java.io.IOException; +import java.io.InputStream; +import java.net.URL; + +import org.apache.accumulo.core.spi.common.ContextClassLoaderFactory.ContextClassLoaderException; + +import edu.umd.cs.findbugs.annotations.SuppressFBWarnings; + +public final class HttpFileResolver extends FileResolver { + + protected HttpFileResolver(URL url) throws ContextClassLoaderException { + super(url); + } + + @Override + public String getFileName() { + String path = getURL().getPath(); + return path.substring(path.lastIndexOf("/") + 1); + } + + @Override + @SuppressFBWarnings(value = "URLCONNECTION_SSRF_FD") + public InputStream getInputStream() throws IOException { + return getURL().openStream(); + } +} diff --git a/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/resolvers/LocalFileResolver.java b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/resolvers/LocalFileResolver.java new file mode 100644 index 0000000..cc105b4 --- /dev/null +++ b/modules/local-caching-classloader/src/main/java/org/apache/accumulo/classloader/lcc/resolvers/LocalFileResolver.java @@ -0,0 +1,64 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.apache.accumulo.classloader.lcc.resolvers; + +import java.io.BufferedInputStream; +import java.io.File; +import java.io.IOException; +import java.io.InputStream; +import java.net.URI; +import java.net.URISyntaxException; +import java.net.URL; +import java.nio.file.Files; +import java.nio.file.Path; + +import org.apache.accumulo.core.spi.common.ContextClassLoaderFactory.ContextClassLoaderException; + +public final class LocalFileResolver extends FileResolver { + + private final File file; + + public LocalFileResolver(URL url) throws ContextClassLoaderException { + super(url); + if (url.getHost() != null && !url.getHost().isBlank()) { + throw new ContextClassLoaderException( + "Unsupported file url, only local files are supported. 
host = " + url.getHost()); + } + try { + final URI uri = url.toURI(); + final Path path = Path.of(uri); + if (Files.notExists(Path.of(uri))) { + throw new ContextClassLoaderException("File: " + url + " does not exist."); + } + file = path.toFile(); + } catch (URISyntaxException e) { + throw new ContextClassLoaderException("Error creating URI from url: " + url, e); + } + } + + @Override + public String getFileName() { + return file.getName(); + } + + @Override + public InputStream getInputStream() throws IOException { + return new BufferedInputStream(Files.newInputStream(file.toPath())); + } +} diff --git a/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/LocalCachingContextClassLoaderFactoryTest.java b/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/LocalCachingContextClassLoaderFactoryTest.java new file mode 100644 index 0000000..587de84 --- /dev/null +++ b/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/LocalCachingContextClassLoaderFactoryTest.java @@ -0,0 +1,737 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.apache.accumulo.classloader.lcc; + +import static org.apache.accumulo.classloader.lcc.TestUtils.createContextDefinitionFile; +import static org.apache.accumulo.classloader.lcc.TestUtils.testClassFailsToLoad; +import static org.apache.accumulo.classloader.lcc.TestUtils.testClassLoads; +import static org.apache.accumulo.classloader.lcc.TestUtils.updateContextDefinitionFile; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertNotEquals; +import static org.junit.jupiter.api.Assertions.assertNotNull; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.junit.jupiter.api.Assertions.fail; + +import java.io.File; +import java.net.MalformedURLException; +import java.net.URL; +import java.nio.file.Files; +import java.nio.file.StandardCopyOption; +import java.nio.file.StandardOpenOption; +import java.util.ArrayList; +import java.util.Collections; +import java.util.List; +import java.util.Map; +import java.util.TreeSet; + +import org.apache.accumulo.classloader.lcc.TestUtils.TestClassInfo; +import org.apache.accumulo.classloader.lcc.definition.ContextDefinition; +import org.apache.accumulo.classloader.lcc.definition.Resource; +import org.apache.accumulo.core.conf.ConfigurationCopy; +import org.apache.accumulo.core.spi.common.ContextClassLoaderFactory.ContextClassLoaderException; +import org.apache.accumulo.core.util.ConfigurationImpl; +import org.apache.hadoop.fs.FileSystem; +import org.apache.hadoop.fs.FsUrlStreamHandlerFactory; +import org.apache.hadoop.fs.Path; +import org.apache.hadoop.hdfs.MiniDFSCluster; +import org.eclipse.jetty.server.Server; +import org.junit.jupiter.api.AfterAll; +import org.junit.jupiter.api.AfterEach; +import org.junit.jupiter.api.BeforeAll; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.io.TempDir; + +public class LocalCachingContextClassLoaderFactoryTest { + + private static final LocalCachingContextClassLoaderFactory FACTORY = + new LocalCachingContextClassLoaderFactory(); + protected static final int MONITOR_INTERVAL_SECS = 5; + private static MiniDFSCluster hdfs; + private static FileSystem fs; + private static Server jetty; + private static URL jarAOrigLocation; + private static URL jarBOrigLocation; + private static URL jarCOrigLocation; + private static URL jarDOrigLocation; + private static URL jarEOrigLocation; + private static URL localAllContext; + private static URL hdfsAllContext; + private static URL jettyAllContext; + private static TestClassInfo classA; + private static TestClassInfo classB; + private static TestClassInfo classC; + private static TestClassInfo classD; + + @TempDir + private static java.nio.file.Path tempDir; + + @BeforeAll + public static void beforeAll() throws Exception { + String baseCacheDir = tempDir.resolve("base").toUri().toString(); + + ConfigurationCopy acuConf = + new ConfigurationCopy(Map.of(Constants.CACHE_DIR_PROPERTY, baseCacheDir)); + FACTORY.init(() -> new ConfigurationImpl(acuConf)); + + // Find the Test jar files + jarAOrigLocation = + LocalCachingContextClassLoaderFactoryTest.class.getResource("/ClassLoaderTestA/TestA.jar"); + assertNotNull(jarAOrigLocation); + jarBOrigLocation = + LocalCachingContextClassLoaderFactoryTest.class.getResource("/ClassLoaderTestB/TestB.jar"); + assertNotNull(jarBOrigLocation); + jarCOrigLocation = + LocalCachingContextClassLoaderFactoryTest.class.getResource("/ClassLoaderTestC/TestC.jar"); + 
assertNotNull(jarCOrigLocation); + jarDOrigLocation = + LocalCachingContextClassLoaderFactoryTest.class.getResource("/ClassLoaderTestD/TestD.jar"); + assertNotNull(jarDOrigLocation); + jarEOrigLocation = + LocalCachingContextClassLoaderFactoryTest.class.getResource("/ClassLoaderTestE/TestE.jar"); + assertNotNull(jarEOrigLocation); + + // Put B into HDFS + hdfs = TestUtils.getMiniCluster(); + URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory(hdfs.getConfiguration(0))); + + fs = hdfs.getFileSystem(); + assertTrue(fs.mkdirs(new Path("/contextB"))); + final Path dst = new Path("/contextB/TestB.jar"); + fs.copyFromLocalFile(new Path(jarBOrigLocation.toURI()), dst); + assertTrue(fs.exists(dst)); + final URL jarBHdfsLocation = new URL(fs.getUri().toString() + dst.toUri().toString()); + + // Have Jetty serve up files from Jar C directory + java.nio.file.Path jarCParentDirectory = + java.nio.file.Path.of(jarCOrigLocation.toURI()).getParent(); + assertNotNull(jarCParentDirectory); + jetty = TestUtils.getJetty(jarCParentDirectory); + final URL jarCJettyLocation = jetty.getURI().resolve("TestC.jar").toURL(); + + // ContextDefinition with all jars + ContextDefinition allJarsDef = ContextDefinition.create("all", MONITOR_INTERVAL_SECS, + jarAOrigLocation, jarBHdfsLocation, jarCJettyLocation, jarDOrigLocation); + String allJarsDefJson = allJarsDef.toJson(); + + // Create local context definition in jar C directory + File localDefFile = jarCParentDirectory.resolve("allContextDefinition.json").toFile(); + Files.writeString(localDefFile.toPath(), allJarsDefJson, StandardOpenOption.CREATE); + assertTrue(Files.exists(localDefFile.toPath())); + + Path hdfsDefFile = new Path("/allContextDefinition.json"); + fs.copyFromLocalFile(new Path(localDefFile.toURI()), hdfsDefFile); + assertTrue(fs.exists(hdfsDefFile)); + + localAllContext = localDefFile.toURI().toURL(); + hdfsAllContext = new URL(fs.getUri().toString() + hdfsDefFile.toUri().toString()); + jettyAllContext = jetty.getURI().resolve("allContextDefinition.json").toURL(); + + classA = new TestClassInfo("test.TestObjectA", "Hello from A"); + classB = new TestClassInfo("test.TestObjectB", "Hello from B"); + classC = new TestClassInfo("test.TestObjectC", "Hello from C"); + classD = new TestClassInfo("test.TestObjectD", "Hello from D"); + } + + @AfterAll + public static void afterAll() throws Exception { + if (jetty != null) { + jetty.stop(); + jetty.join(); + } + if (hdfs != null) { + hdfs.shutdown(); + } + } + + @AfterEach + public void afterEach() { + FACTORY.resetForTests(); + } + + @Test + public void testCreateFromLocal() throws Exception { + final ClassLoader cl = FACTORY.getClassLoader(localAllContext.toString()); + testClassLoads(cl, classA); + testClassLoads(cl, classB); + testClassLoads(cl, classC); + testClassLoads(cl, classD); + } + + @Test + public void testCreateFromHdfs() throws Exception { + final ClassLoader cl = FACTORY.getClassLoader(hdfsAllContext.toString()); + testClassLoads(cl, classA); + testClassLoads(cl, classB); + testClassLoads(cl, classC); + testClassLoads(cl, classD); + } + + @Test + public void testCreateFromHttp() throws Exception { + final ClassLoader cl = FACTORY.getClassLoader(jettyAllContext.toString()); + testClassLoads(cl, classA); + testClassLoads(cl, classB); + testClassLoads(cl, classC); + testClassLoads(cl, classD); + } + + @Test + public void testInvalidContextDefinitionURL() { + ContextClassLoaderException ex = + assertThrows(ContextClassLoaderException.class, () -> FACTORY.getClassLoader("/not/a/URL")); + 
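+    // A bare filesystem path is rejected; the context name must be a valid URL to the
+    // context definition file.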
assertEquals("Error getting classloader for context: Expected valid URL to context definition " + + "file but received: /not/a/URL", ex.getMessage()); + } + + @Test + public void testInitialContextDefinitionEmpty() throws Exception { + // Create a new context definition file in HDFS, but with no content + final Path def = createContextDefinitionFile(fs, "EmptyContextDefinitionFile.json", null); + final URL emptyDefUrl = new URL(fs.getUri().toString() + def.toUri().toString()); + + ContextClassLoaderException ex = assertThrows(ContextClassLoaderException.class, + () -> FACTORY.getClassLoader(emptyDefUrl.toString())); + assertEquals( + "Error getting classloader for context: ContextDefinition null for context definition " + + "file: " + emptyDefUrl.toString(), + ex.getMessage()); + } + + @Test + public void testInitialInvalidJson() throws Exception { + // Create a new context definition file in HDFS, but with invalid content + ContextDefinition def = + ContextDefinition.create("invalid", MONITOR_INTERVAL_SECS, jarAOrigLocation); + // write out invalid json + final Path invalid = createContextDefinitionFile(fs, "InvalidContextDefinitionFile.json", + def.toJson().substring(0, 4)); + final URL invalidDefUrl = new URL(fs.getUri().toString() + invalid.toUri().toString()); + + ContextClassLoaderException ex = assertThrows(ContextClassLoaderException.class, + () -> FACTORY.getClassLoader(invalidDefUrl.toString())); + assertTrue(ex.getMessage().startsWith( + "Error getting classloader for context: com.google.gson.stream.MalformedJsonException")); + } + + @Test + public void testInitial() throws Exception { + ContextDefinition def = + ContextDefinition.create("initial", MONITOR_INTERVAL_SECS, jarAOrigLocation); + final Path initial = + createContextDefinitionFile(fs, "InitialContextDefinitionFile.json", def.toJson()); + final URL initialDefUrl = new URL(fs.getUri().toString() + initial.toUri().toString()); + + ClassLoader cl = FACTORY.getClassLoader(initialDefUrl.toString()); + + testClassLoads(cl, classA); + testClassFailsToLoad(cl, classB); + testClassFailsToLoad(cl, classC); + testClassFailsToLoad(cl, classD); + } + + @Test + public void testInitialNonExistentResource() throws Exception { + // copy jarA to some other name + java.nio.file.Path jarAPath = java.nio.file.Path.of(jarAOrigLocation.toURI()); + java.nio.file.Path jarAPathParent = jarAPath.getParent(); + assertNotNull(jarAPathParent); + java.nio.file.Path jarACopy = jarAPathParent.resolve("jarACopy.jar"); + assertTrue(!Files.exists(jarACopy)); + Files.copy(jarAPath, jarACopy, StandardCopyOption.REPLACE_EXISTING); + assertTrue(Files.exists(jarACopy)); + + ContextDefinition def = + ContextDefinition.create("initial", MONITOR_INTERVAL_SECS, jarACopy.toUri().toURL()); + + Files.delete(jarACopy); + assertTrue(!Files.exists(jarACopy)); + + final Path initial = createContextDefinitionFile(fs, + "InitialContextDefinitionFileMissingResource.json", def.toJson()); + final URL initialDefUrl = new URL(fs.getUri().toString() + initial.toUri().toString()); + + ContextClassLoaderException ex = assertThrows(ContextClassLoaderException.class, + () -> FACTORY.getClassLoader(initialDefUrl.toString())); + assertTrue(ex.getMessage().endsWith("jarACopy.jar does not exist.")); + } + + @Test + public void testInitialBadResourceURL() throws Exception { + Resource r = new Resource(); + // remove the file:// prefix from the URL + r.setLocation(jarAOrigLocation.toString().substring(6)); + r.setChecksum("1234"); + TreeSet resources = new TreeSet<>(); + 
resources.add(r); + + ContextDefinition def = new ContextDefinition(); + def.setContextName("initial"); + def.setMonitorIntervalSeconds(MONITOR_INTERVAL_SECS); + def.setResources(resources); + + final Path initial = createContextDefinitionFile(fs, + "InitialContextDefinitionBadResourceURL.json", def.toJson()); + final URL initialDefUrl = new URL(fs.getUri().toString() + initial.toUri().toString()); + + ContextClassLoaderException ex = assertThrows(ContextClassLoaderException.class, + () -> FACTORY.getClassLoader(initialDefUrl.toString())); + assertTrue(ex.getMessage().startsWith("Error getting classloader for context: no protocol")); + Throwable t = ex.getCause(); + assertTrue(t instanceof MalformedURLException); + assertTrue(t.getMessage().startsWith("no protocol")); + } + + @Test + public void testInitialBadResourceChecksum() throws Exception { + Resource r = new Resource(); + r.setLocation(jarAOrigLocation.toString()); + r.setChecksum("1234"); + TreeSet resources = new TreeSet<>(); + resources.add(r); + + ContextDefinition def = new ContextDefinition(); + def.setContextName("initial"); + def.setMonitorIntervalSeconds(MONITOR_INTERVAL_SECS); + def.setResources(resources); + + final Path initial = createContextDefinitionFile(fs, + "InitialContextDefinitionBadResourceChecksum.json", def.toJson()); + final URL initialDefUrl = new URL(fs.getUri().toString() + initial.toUri().toString()); + + ContextClassLoaderException ex = assertThrows(ContextClassLoaderException.class, + () -> FACTORY.getClassLoader(initialDefUrl.toString())); + assertTrue(ex.getMessage().startsWith("Error getting classloader for context: Checksum")); + Throwable t = ex.getCause(); + assertTrue(t instanceof IllegalStateException); + assertTrue( + t.getMessage().endsWith("TestA.jar does not match checksum in context definition 1234")); + } + + @Test + public void testUpdate() throws Exception { + final ContextDefinition def = + ContextDefinition.create("update", MONITOR_INTERVAL_SECS, jarAOrigLocation); + final Path defFilePath = + createContextDefinitionFile(fs, "UpdateContextDefinitionFile.json", def.toJson()); + final URL updateDefUrl = new URL(fs.getUri().toString() + defFilePath.toUri().toString()); + + final ClassLoader cl = FACTORY.getClassLoader(updateDefUrl.toString()); + + testClassLoads(cl, classA); + testClassFailsToLoad(cl, classB); + testClassFailsToLoad(cl, classC); + testClassFailsToLoad(cl, classD); + + // Update the contents of the context definition json file + ContextDefinition updateDef = + ContextDefinition.create("update", MONITOR_INTERVAL_SECS, jarDOrigLocation); + updateContextDefinitionFile(fs, defFilePath, updateDef.toJson()); + + // wait 2x the monitor interval + Thread.sleep(MONITOR_INTERVAL_SECS * 2 * 1000); + + final ClassLoader cl2 = FACTORY.getClassLoader(updateDefUrl.toString()); + + assertNotEquals(cl, cl2); + + testClassFailsToLoad(cl2, classA); + testClassFailsToLoad(cl2, classB); + testClassFailsToLoad(cl2, classC); + testClassLoads(cl2, classD); + } + + @Test + public void testUpdateSameClassNameDifferentContent() throws Exception { + final ContextDefinition def = + ContextDefinition.create("update", MONITOR_INTERVAL_SECS, jarAOrigLocation); + final Path defFilePath = + createContextDefinitionFile(fs, "UpdateContextDefinitionFile.json", def.toJson()); + final URL updateDefUrl = new URL(fs.getUri().toString() + defFilePath.toUri().toString()); + + final ClassLoader cl = FACTORY.getClassLoader(updateDefUrl.toString()); + + testClassLoads(cl, classA); + testClassFailsToLoad(cl, classB); + 
testClassFailsToLoad(cl, classC); + testClassFailsToLoad(cl, classD); + + // Update the contents of the context definition json file + ContextDefinition updateDef = + ContextDefinition.create("update", MONITOR_INTERVAL_SECS, jarEOrigLocation); + updateContextDefinitionFile(fs, defFilePath, updateDef.toJson()); + + // wait 2x the monitor interval + Thread.sleep(MONITOR_INTERVAL_SECS * 2 * 1000); + + final ClassLoader cl2 = FACTORY.getClassLoader(updateDefUrl.toString()); + + assertNotEquals(cl, cl2); + + @SuppressWarnings("unchecked") + Class clazz = + (Class) cl2.loadClass(classA.getClassName()); + test.Test impl = clazz.getDeclaredConstructor().newInstance(); + assertEquals("Hello from E", impl.hello()); + testClassFailsToLoad(cl2, classB); + testClassFailsToLoad(cl2, classC); + testClassFailsToLoad(cl2, classD); + } + + @Test + public void testUpdateContextDefinitionEmpty() throws Exception { + final ContextDefinition def = + ContextDefinition.create("update", MONITOR_INTERVAL_SECS, jarAOrigLocation); + final Path defFilePath = + createContextDefinitionFile(fs, "UpdateEmptyContextDefinitionFile.json", def.toJson()); + final URL updateDefUrl = new URL(fs.getUri().toString() + defFilePath.toUri().toString()); + + final ClassLoader cl = FACTORY.getClassLoader(updateDefUrl.toString()); + + testClassLoads(cl, classA); + testClassFailsToLoad(cl, classB); + testClassFailsToLoad(cl, classC); + testClassFailsToLoad(cl, classD); + + // Update the contents of the context definition json file with an empty file + updateContextDefinitionFile(fs, defFilePath, null); + + // wait 2x the monitor interval + Thread.sleep(MONITOR_INTERVAL_SECS * 2 * 1000); + + final ClassLoader cl2 = FACTORY.getClassLoader(updateDefUrl.toString()); + + // validate that the classloader has not updated + assertEquals(cl, cl2); + testClassLoads(cl2, classA); + testClassFailsToLoad(cl2, classB); + testClassFailsToLoad(cl2, classC); + testClassFailsToLoad(cl2, classD); + + } + + @Test + public void testUpdateNonExistentResource() throws Exception { + final ContextDefinition def = + ContextDefinition.create("update", MONITOR_INTERVAL_SECS, jarAOrigLocation); + final Path defFilePath = + createContextDefinitionFile(fs, "UpdateNonExistentResource.json", def.toJson()); + final URL updateDefUrl = new URL(fs.getUri().toString() + defFilePath.toUri().toString()); + + final ClassLoader cl = FACTORY.getClassLoader(updateDefUrl.toString()); + + testClassLoads(cl, classA); + testClassFailsToLoad(cl, classB); + testClassFailsToLoad(cl, classC); + testClassFailsToLoad(cl, classD); + + // copy jarA to jarACopy + // create a ContextDefinition that references it + // delete jarACopy + java.nio.file.Path jarAPath = java.nio.file.Path.of(jarAOrigLocation.toURI()); + java.nio.file.Path jarAPathParent = jarAPath.getParent(); + assertNotNull(jarAPathParent); + java.nio.file.Path jarACopy = jarAPathParent.resolve("jarACopy.jar"); + assertTrue(!Files.exists(jarACopy)); + Files.copy(jarAPath, jarACopy, StandardCopyOption.REPLACE_EXISTING); + assertTrue(Files.exists(jarACopy)); + ContextDefinition def2 = + ContextDefinition.create("initial", MONITOR_INTERVAL_SECS, jarACopy.toUri().toURL()); + Files.delete(jarACopy); + assertTrue(!Files.exists(jarACopy)); + + updateContextDefinitionFile(fs, defFilePath, def2.toJson()); + + // wait 2x the monitor interval + Thread.sleep(MONITOR_INTERVAL_SECS * 2 * 1000); + + final ClassLoader cl2 = FACTORY.getClassLoader(updateDefUrl.toString()); + + // validate that the classloader has not updated + assertEquals(cl, 
cl2); + testClassLoads(cl2, classA); + testClassFailsToLoad(cl2, classB); + testClassFailsToLoad(cl2, classC); + testClassFailsToLoad(cl2, classD); + } + + @Test + public void testUpdateBadResourceChecksum() throws Exception { + final ContextDefinition def = + ContextDefinition.create("update", MONITOR_INTERVAL_SECS, jarAOrigLocation); + final Path defFilePath = + createContextDefinitionFile(fs, "UpdateBadResourceChecksum.json", def.toJson()); + final URL updateDefUrl = new URL(fs.getUri().toString() + defFilePath.toUri().toString()); + + final ClassLoader cl = FACTORY.getClassLoader(updateDefUrl.toString()); + + testClassLoads(cl, classA); + testClassFailsToLoad(cl, classB); + testClassFailsToLoad(cl, classC); + testClassFailsToLoad(cl, classD); + + Resource r = new Resource(); + r.setLocation(jarAOrigLocation.toString()); + r.setChecksum("1234"); + TreeSet resources = new TreeSet<>(); + resources.add(r); + + ContextDefinition def2 = new ContextDefinition(); + def2.setContextName("update"); + def2.setMonitorIntervalSeconds(MONITOR_INTERVAL_SECS); + def2.setResources(resources); + + updateContextDefinitionFile(fs, defFilePath, def2.toJson()); + + // wait 2x the monitor interval + Thread.sleep(MONITOR_INTERVAL_SECS * 2 * 1000); + + final ClassLoader cl2 = FACTORY.getClassLoader(updateDefUrl.toString()); + + // validate that the classloader has not updated + assertEquals(cl, cl2); + testClassLoads(cl2, classA); + testClassFailsToLoad(cl2, classB); + testClassFailsToLoad(cl2, classC); + testClassFailsToLoad(cl2, classD); + } + + @Test + public void testUpdateBadResourceURL() throws Exception { + final ContextDefinition def = + ContextDefinition.create("update", MONITOR_INTERVAL_SECS, jarAOrigLocation); + final Path defFilePath = + createContextDefinitionFile(fs, "UpdateBadResourceChecksum.json", def.toJson()); + final URL updateDefUrl = new URL(fs.getUri().toString() + defFilePath.toUri().toString()); + + final ClassLoader cl = FACTORY.getClassLoader(updateDefUrl.toString()); + + testClassLoads(cl, classA); + testClassFailsToLoad(cl, classB); + testClassFailsToLoad(cl, classC); + testClassFailsToLoad(cl, classD); + + Resource r = new Resource(); + // remove the file:// prefix from the URL + r.setLocation(jarAOrigLocation.toString().substring(6)); + r.setChecksum("1234"); + TreeSet resources = new TreeSet<>(); + resources.add(r); + + ContextDefinition def2 = new ContextDefinition(); + def2.setContextName("initial"); + def2.setMonitorIntervalSeconds(MONITOR_INTERVAL_SECS); + def2.setResources(resources); + + updateContextDefinitionFile(fs, defFilePath, def2.toJson()); + + // wait 2x the monitor interval + Thread.sleep(MONITOR_INTERVAL_SECS * 2 * 1000); + + final ClassLoader cl2 = FACTORY.getClassLoader(updateDefUrl.toString()); + + // validate that the classloader has not updated + assertEquals(cl, cl2); + testClassLoads(cl2, classA); + testClassFailsToLoad(cl2, classB); + testClassFailsToLoad(cl2, classC); + testClassFailsToLoad(cl2, classD); + } + + @Test + public void testUpdateInvalidJson() throws Exception { + final ContextDefinition def = + ContextDefinition.create("update", MONITOR_INTERVAL_SECS, jarAOrigLocation); + final Path defFilePath = + createContextDefinitionFile(fs, "UpdateInvalidContextDefinitionFile.json", def.toJson()); + final URL updateDefUrl = new URL(fs.getUri().toString() + defFilePath.toUri().toString()); + + final ClassLoader cl = FACTORY.getClassLoader(updateDefUrl.toString()); + + testClassLoads(cl, classA); + testClassFailsToLoad(cl, classB); + 
testClassFailsToLoad(cl, classC); + testClassFailsToLoad(cl, classD); + + ContextDefinition updateDef = + ContextDefinition.create("update", MONITOR_INTERVAL_SECS, jarDOrigLocation); + updateContextDefinitionFile(fs, defFilePath, updateDef.toJson().substring(0, 4)); + + // wait 2x the monitor interval + Thread.sleep(MONITOR_INTERVAL_SECS * 2 * 1000); + + final ClassLoader cl2 = FACTORY.getClassLoader(updateDefUrl.toString()); + + // validate that the classloader has not updated + assertEquals(cl, cl2); + testClassLoads(cl2, classA); + testClassFailsToLoad(cl2, classB); + testClassFailsToLoad(cl2, classC); + testClassFailsToLoad(cl2, classD); + + // Re-write the updated context definition such that it is now valid + updateContextDefinitionFile(fs, defFilePath, updateDef.toJson()); + + // wait 2x the monitor interval + Thread.sleep(MONITOR_INTERVAL_SECS * 2 * 1000); + + final ClassLoader cl3 = FACTORY.getClassLoader(updateDefUrl.toString()); + + assertEquals(cl, cl2); + assertNotEquals(cl, cl3); + testClassFailsToLoad(cl3, classA); + testClassFailsToLoad(cl3, classB); + testClassFailsToLoad(cl3, classC); + testClassLoads(cl3, classD); + } + + @Test + public void testChangingContext() throws Exception { + ContextDefinition def = ContextDefinition.create("update", MONITOR_INTERVAL_SECS, + jarAOrigLocation, jarBOrigLocation, jarCOrigLocation, jarDOrigLocation); + final Path update = + createContextDefinitionFile(fs, "UpdateChangingContextDefinition.json", def.toJson()); + final URL updatedDefUrl = new URL(fs.getUri().toString() + update.toUri().toString()); + + final ClassLoader cl = FACTORY.getClassLoader(updatedDefUrl.toString()); + testClassLoads(cl, classA); + testClassLoads(cl, classB); + testClassLoads(cl, classC); + testClassLoads(cl, classD); + + final List masterList = new ArrayList<>(); + masterList.add(jarAOrigLocation); + masterList.add(jarBOrigLocation); + masterList.add(jarCOrigLocation); + masterList.add(jarDOrigLocation); + + List priorList = masterList; + ClassLoader priorCL = cl; + + for (int i = 0; i < 20; i++) { + final List updatedList = new ArrayList<>(masterList); + Collections.shuffle(updatedList); + final URL removed = updatedList.remove(0); + + // Update the contents of the context definition json file + ContextDefinition updateDef = ContextDefinition.create("update", MONITOR_INTERVAL_SECS, + updatedList.toArray(new URL[] {})); + updateContextDefinitionFile(fs, update, updateDef.toJson()); + + // wait 2x the monitor interval + Thread.sleep(MONITOR_INTERVAL_SECS * 2 * 1000); + + final ClassLoader updatedClassLoader = FACTORY.getClassLoader(updatedDefUrl.toString()); + + if (updatedList.equals(priorList)) { + assertEquals(priorCL, updatedClassLoader); + } else { + assertNotEquals(cl, updatedClassLoader); + for (URL u : updatedList) { + if (u.toString().equals(jarAOrigLocation.toString())) { + testClassLoads(updatedClassLoader, classA); + } else if (u.toString().equals(jarBOrigLocation.toString())) { + testClassLoads(updatedClassLoader, classB); + } else if (u.toString().equals(jarCOrigLocation.toString())) { + testClassLoads(updatedClassLoader, classC); + } else if (u.toString().equals(jarDOrigLocation.toString())) { + testClassLoads(updatedClassLoader, classD); + } else { + fail("Unexpected url: " + u.toString()); + } + } + } + if (removed.toString().equals(jarAOrigLocation.toString())) { + testClassFailsToLoad(updatedClassLoader, classA); + } else if (removed.toString().equals(jarBOrigLocation.toString())) { + testClassFailsToLoad(updatedClassLoader, classB); + } else 
if (removed.toString().equals(jarCOrigLocation.toString())) { + testClassFailsToLoad(updatedClassLoader, classC); + } else if (removed.toString().equals(jarDOrigLocation.toString())) { + testClassFailsToLoad(updatedClassLoader, classD); + } else { + fail("Unexpected url: " + removed.toString()); + } + priorCL = updatedClassLoader; + priorList = updatedList; + } + } + + @Test + public void testGracePeriod() throws Exception { + final LocalCachingContextClassLoaderFactory localFactory = + new LocalCachingContextClassLoaderFactory(); + + String baseCacheDir = tempDir.resolve("base").toUri().toString(); + ConfigurationCopy acuConf = new ConfigurationCopy(Map.of(Constants.CACHE_DIR_PROPERTY, + baseCacheDir, Constants.UPDATE_FAILURE_GRACE_PERIOD_MINS, "1")); + localFactory.init(() -> new ConfigurationImpl(acuConf)); + + final ContextDefinition def = + ContextDefinition.create("update", MONITOR_INTERVAL_SECS, jarAOrigLocation); + final Path defFilePath = + createContextDefinitionFile(fs, "UpdateNonExistentResource.json", def.toJson()); + final URL updateDefUrl = new URL(fs.getUri().toString() + defFilePath.toUri().toString()); + + final ClassLoader cl = localFactory.getClassLoader(updateDefUrl.toString()); + + testClassLoads(cl, classA); + testClassFailsToLoad(cl, classB); + testClassFailsToLoad(cl, classC); + testClassFailsToLoad(cl, classD); + + // copy jarA to jarACopy + // create a ContextDefinition that references it + // delete jarACopy + java.nio.file.Path jarAPath = java.nio.file.Path.of(jarAOrigLocation.toURI()); + java.nio.file.Path jarAPathParent = jarAPath.getParent(); + assertNotNull(jarAPathParent); + java.nio.file.Path jarACopy = jarAPathParent.resolve("jarACopy.jar"); + assertTrue(!Files.exists(jarACopy)); + Files.copy(jarAPath, jarACopy, StandardCopyOption.REPLACE_EXISTING); + assertTrue(Files.exists(jarACopy)); + ContextDefinition def2 = + ContextDefinition.create("initial", MONITOR_INTERVAL_SECS, jarACopy.toUri().toURL()); + Files.delete(jarACopy); + assertTrue(!Files.exists(jarACopy)); + + updateContextDefinitionFile(fs, defFilePath, def2.toJson()); + + // wait 2x the monitor interval + Thread.sleep(MONITOR_INTERVAL_SECS * 2 * 1000); + + final ClassLoader cl2 = localFactory.getClassLoader(updateDefUrl.toString()); + + // validate that the classloader has not updated + assertEquals(cl, cl2); + testClassLoads(cl2, classA); + testClassFailsToLoad(cl2, classB); + testClassFailsToLoad(cl2, classC); + testClassFailsToLoad(cl2, classD); + + // Wait 2 minutes for grace period to expire + Thread.sleep(120_000); + + ContextClassLoaderException ex = assertThrows(ContextClassLoaderException.class, + () -> localFactory.getClassLoader(updateDefUrl.toString())); + assertTrue(ex.getMessage().endsWith("jarACopy.jar does not exist.")); + + } + +} diff --git a/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/LocalCachingContextTest.java b/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/LocalCachingContextTest.java new file mode 100644 index 0000000..b79f516 --- /dev/null +++ b/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/LocalCachingContextTest.java @@ -0,0 +1,195 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. 
The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.apache.accumulo.classloader.lcc; + +import static org.apache.accumulo.classloader.lcc.TestUtils.testClassFailsToLoad; +import static org.apache.accumulo.classloader.lcc.TestUtils.testClassLoads; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertNotEquals; +import static org.junit.jupiter.api.Assertions.assertNotNull; +import static org.junit.jupiter.api.Assertions.assertTrue; + +import java.net.URL; +import java.nio.file.Files; +import java.util.TreeSet; + +import org.apache.accumulo.classloader.lcc.TestUtils.TestClassInfo; +import org.apache.accumulo.classloader.lcc.definition.ContextDefinition; +import org.apache.accumulo.classloader.lcc.definition.Resource; +import org.apache.hadoop.fs.FileSystem; +import org.apache.hadoop.fs.FsUrlStreamHandlerFactory; +import org.apache.hadoop.fs.Path; +import org.apache.hadoop.hdfs.MiniDFSCluster; +import org.eclipse.jetty.server.Server; +import org.junit.jupiter.api.AfterAll; +import org.junit.jupiter.api.BeforeAll; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.io.TempDir; + +public class LocalCachingContextTest { + + private static final String CONTEXT_NAME = "TEST_CONTEXT"; + private static final int MONITOR_INTERVAL_SECS = 5; + private static MiniDFSCluster hdfs; + private static Server jetty; + private static ContextDefinition def; + private static TestClassInfo classA; + private static TestClassInfo classB; + private static TestClassInfo classC; + private static TestClassInfo classD; + private static String baseCacheDir; + + @TempDir + private static java.nio.file.Path tempDir; + + @BeforeAll + public static void beforeAll() throws Exception { + baseCacheDir = tempDir.resolve("base").toUri().toString(); + + // Find the Test jar files + final URL jarAOrigLocation = + LocalCachingContextTest.class.getResource("/ClassLoaderTestA/TestA.jar"); + assertNotNull(jarAOrigLocation); + final URL jarBOrigLocation = + LocalCachingContextTest.class.getResource("/ClassLoaderTestB/TestB.jar"); + assertNotNull(jarBOrigLocation); + final URL jarCOrigLocation = + LocalCachingContextTest.class.getResource("/ClassLoaderTestC/TestC.jar"); + assertNotNull(jarCOrigLocation); + + // Put B into HDFS + hdfs = TestUtils.getMiniCluster(); + final FileSystem fs = hdfs.getFileSystem(); + assertTrue(fs.mkdirs(new Path("/contextB"))); + final Path dst = new Path("/contextB/TestB.jar"); + fs.copyFromLocalFile(new Path(jarBOrigLocation.toURI()), dst); + assertTrue(fs.exists(dst)); + URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory(hdfs.getConfiguration(0))); + final URL jarBNewLocation = new URL(fs.getUri().toString() + dst.toUri().toString()); + + // Put C into Jetty + java.nio.file.Path jarCParentDirectory = + java.nio.file.Path.of(jarCOrigLocation.toURI()).getParent(); + jetty = TestUtils.getJetty(jarCParentDirectory); + 
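+    // Jetty is started on an ephemeral port, so resolve TestC.jar against the URI it reports.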
final URL jarCNewLocation = jetty.getURI().resolve("TestC.jar").toURL(); + + // Create ContextDefinition with all three resources + final TreeSet resources = new TreeSet<>(); + resources.add(new Resource(jarAOrigLocation.toString(), + TestUtils.computeResourceChecksum(jarAOrigLocation))); + resources.add(new Resource(jarBNewLocation.toString(), + TestUtils.computeResourceChecksum(jarBOrigLocation))); + resources.add(new Resource(jarCNewLocation.toString(), + TestUtils.computeResourceChecksum(jarCOrigLocation))); + + def = new ContextDefinition(CONTEXT_NAME, MONITOR_INTERVAL_SECS, resources); + classA = new TestClassInfo("test.TestObjectA", "Hello from A"); + classB = new TestClassInfo("test.TestObjectB", "Hello from B"); + classC = new TestClassInfo("test.TestObjectC", "Hello from C"); + classD = new TestClassInfo("test.TestObjectD", "Hello from D"); + } + + @AfterAll + public static void afterAll() throws Exception { + if (jetty != null) { + jetty.stop(); + jetty.join(); + } + if (hdfs != null) { + hdfs.shutdown(); + } + } + + @Test + public void testInitialize() throws Exception { + LocalCachingContext lcccl = new LocalCachingContext(baseCacheDir, def); + lcccl.initialize(); + + // Confirm the 3 jars are cached locally + final java.nio.file.Path base = java.nio.file.Path.of(tempDir.resolve("base").toUri()); + assertTrue(Files.exists(base)); + assertTrue(Files.exists(base.resolve(CONTEXT_NAME))); + for (Resource r : def.getResources()) { + String filename = TestUtils.getFileName(r.getURL()); + String checksum = r.getChecksum(); + assertTrue(Files.exists(base.resolve(CONTEXT_NAME).resolve(filename + "_" + checksum))); + } + } + + @Test + public void testClassLoader() throws Exception { + + LocalCachingContext lcccl = new LocalCachingContext(baseCacheDir, def); + lcccl.initialize(); + ClassLoader contextClassLoader = lcccl.getClassloader(); + + testClassLoads(contextClassLoader, classA); + testClassLoads(contextClassLoader, classB); + testClassLoads(contextClassLoader, classC); + } + + @Test + public void testUpdate() throws Exception { + + LocalCachingContext lcccl = new LocalCachingContext(baseCacheDir, def); + lcccl.initialize(); + + final ClassLoader contextClassLoader = lcccl.getClassloader(); + + testClassLoads(contextClassLoader, classA); + testClassLoads(contextClassLoader, classB); + testClassLoads(contextClassLoader, classC); + + TreeSet updatedResources = new TreeSet<>(def.getResources()); + assertEquals(3, updatedResources.size()); + updatedResources.remove(updatedResources.last()); // remove C + + // Add D + final URL jarDOrigLocation = + LocalCachingContextTest.class.getResource("/ClassLoaderTestD/TestD.jar"); + assertNotNull(jarDOrigLocation); + updatedResources.add(new Resource(jarDOrigLocation.toString(), + TestUtils.computeResourceChecksum(jarDOrigLocation))); + + ContextDefinition updatedDef = + new ContextDefinition(CONTEXT_NAME, MONITOR_INTERVAL_SECS, updatedResources); + lcccl.update(updatedDef); + + // Confirm the 3 jars are cached locally + final java.nio.file.Path base = java.nio.file.Path.of(tempDir.resolve("base").toUri()); + assertTrue(Files.exists(base)); + assertTrue(Files.exists(base.resolve(CONTEXT_NAME))); + for (Resource r : updatedDef.getResources()) { + String filename = TestUtils.getFileName(r.getURL()); + assertFalse(filename.contains("C")); + String checksum = r.getChecksum(); + assertTrue(Files.exists(base.resolve(CONTEXT_NAME).resolve(filename + "_" + checksum))); + } + + final ClassLoader updatedContextClassLoader = lcccl.getClassloader(); + + 
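+    // After update(), getClassloader() returns a new loader: A, B, and D load, C no longer does.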
assertNotEquals(contextClassLoader, updatedContextClassLoader); + testClassLoads(updatedContextClassLoader, classA); + testClassLoads(updatedContextClassLoader, classB); + testClassFailsToLoad(updatedContextClassLoader, classC); + testClassLoads(updatedContextClassLoader, classD); + } + +} diff --git a/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/MiniAccumuloClusterClassLoaderFactoryTest.java b/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/MiniAccumuloClusterClassLoaderFactoryTest.java new file mode 100644 index 0000000..6e6fed2 --- /dev/null +++ b/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/MiniAccumuloClusterClassLoaderFactoryTest.java @@ -0,0 +1,288 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.apache.accumulo.classloader.lcc; + +import static java.nio.charset.StandardCharsets.UTF_8; +import static java.nio.file.attribute.PosixFilePermission.OWNER_EXECUTE; +import static java.nio.file.attribute.PosixFilePermission.OWNER_READ; +import static java.nio.file.attribute.PosixFilePermission.OWNER_WRITE; +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertIterableEquals; +import static org.junit.jupiter.api.Assertions.assertNotNull; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.junit.jupiter.api.Assertions.assertTrue; + +import java.io.File; +import java.net.URL; +import java.nio.file.Files; +import java.nio.file.Path; +import java.nio.file.StandardCopyOption; +import java.nio.file.StandardOpenOption; +import java.nio.file.attribute.FileAttribute; +import java.nio.file.attribute.PosixFilePermission; +import java.nio.file.attribute.PosixFilePermissions; +import java.util.Arrays; +import java.util.Collections; +import java.util.EnumSet; +import java.util.List; +import java.util.Map.Entry; +import java.util.Set; +import java.util.TreeSet; +import java.util.stream.Collectors; + +import org.apache.accumulo.classloader.lcc.definition.ContextDefinition; +import org.apache.accumulo.core.client.Accumulo; +import org.apache.accumulo.core.client.AccumuloClient; +import org.apache.accumulo.core.client.AccumuloException; +import org.apache.accumulo.core.client.AccumuloSecurityException; +import org.apache.accumulo.core.client.IteratorSetting; +import org.apache.accumulo.core.client.Scanner; +import org.apache.accumulo.core.client.TableNotFoundException; +import org.apache.accumulo.core.clientImpl.AccumuloServerException; +import org.apache.accumulo.core.clientImpl.ClientContext; +import org.apache.accumulo.core.conf.Property; +import org.apache.accumulo.core.data.Key; +import org.apache.accumulo.core.data.TableId; 
+import org.apache.accumulo.core.data.Value;
+import org.apache.accumulo.core.iterators.IteratorUtil.IteratorScope;
+import org.apache.accumulo.core.metadata.schema.Ample;
+import org.apache.accumulo.core.metadata.schema.TabletMetadata;
+import org.apache.accumulo.core.metadata.schema.TabletsMetadata;
+import org.apache.accumulo.harness.MiniClusterConfigurationCallback;
+import org.apache.accumulo.harness.SharedMiniClusterBase;
+import org.apache.accumulo.miniclusterImpl.MiniAccumuloConfigImpl;
+import org.apache.accumulo.test.TestIngest;
+import org.apache.accumulo.test.TestIngest.IngestParams;
+import org.apache.accumulo.test.VerifyIngest;
+import org.apache.accumulo.test.VerifyIngest.VerifyParams;
+import org.junit.jupiter.api.AfterAll;
+import org.junit.jupiter.api.BeforeAll;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.io.TempDir;
+
+public class MiniAccumuloClusterClassLoaderFactoryTest extends SharedMiniClusterBase {
+
+  private static class TestMACConfiguration implements MiniClusterConfigurationCallback {
+
+    @Override
+    public void configureMiniCluster(MiniAccumuloConfigImpl cfg,
+        org.apache.hadoop.conf.Configuration coreSite) {
+      cfg.setNumTservers(3);
+      cfg.setProperty(Property.TSERV_NATIVEMAP_ENABLED.getKey(), "false");
+      cfg.setProperty(Property.GENERAL_CONTEXT_CLASSLOADER_FACTORY.getKey(),
+          LocalCachingContextClassLoaderFactory.class.getName());
+      cfg.setProperty(Constants.CACHE_DIR_PROPERTY, tempDir.resolve("base").toUri().toString());
+      cfg.setProperty(Constants.UPDATE_FAILURE_GRACE_PERIOD_MINS, "1");
+    }
+  }
+
+  @TempDir
+  private static java.nio.file.Path tempDir;
+
+  private static final Set<PosixFilePermission> CACHE_DIR_PERMS =
+      EnumSet.of(OWNER_READ, OWNER_WRITE, OWNER_EXECUTE);
+  private static final FileAttribute<Set<PosixFilePermission>> PERMISSIONS =
+      PosixFilePermissions.asFileAttribute(CACHE_DIR_PERMS);
+  private static final String ITER_CLASS_NAME =
+      "org.apache.accumulo.classloader.vfs.examples.ExampleIterator";
+  private static final int MONITOR_INTERVAL_SECS =
+      LocalCachingContextClassLoaderFactoryTest.MONITOR_INTERVAL_SECS;
+
+  private static URL jarAOrigLocation;
+  private static URL jarBOrigLocation;
+
+  @BeforeAll
+  public static void beforeAll() throws Exception {
+
+    // Find the Test jar files
+    jarAOrigLocation = MiniAccumuloClusterClassLoaderFactoryTest.class
+        .getResource("/ExampleIteratorsA/example-iterators-a.jar");
+    assertNotNull(jarAOrigLocation);
+    jarBOrigLocation = MiniAccumuloClusterClassLoaderFactoryTest.class
+        .getResource("/ExampleIteratorsB/example-iterators-b.jar");
+    assertNotNull(jarBOrigLocation);
+
+    startMiniClusterWithConfig(new TestMACConfiguration());
+  }
+
+  @AfterAll
+  public static void afterAll() throws Exception {
+    stopMiniCluster();
+  }
+
+  @Test
+  public void testClassLoader() throws Exception {
+
+    Path baseDirPath = tempDir.resolve("base");
+    Path jsonDirPath = baseDirPath.resolve("contextFiles");
+    Files.createDirectory(jsonDirPath, PERMISSIONS);
+
+    // Create a context definition that only references jar A
+    final ContextDefinition testContextDef =
+        ContextDefinition.create("test", MONITOR_INTERVAL_SECS, jarAOrigLocation);
+    final String testContextDefJson = testContextDef.toJson();
+    final File testContextDefFile = jsonDirPath.resolve("testContextDefinition.json").toFile();
+    Files.writeString(testContextDefFile.toPath(), testContextDefJson, StandardOpenOption.CREATE);
+    assertTrue(Files.exists(testContextDefFile.toPath()));
+
+    final String[] names = this.getUniqueNames(1);
+    try (AccumuloClient client =
+        Accumulo.newClient().from(getCluster().getClientProperties()).build()) {
+
+      List<String> tservers = client.instanceOperations().getTabletServers();
+      Collections.sort(tservers);
+      assertEquals(3, tservers.size());
+
+      final String tableName = names[0];
+
+      final IngestParams params = new IngestParams(client.properties(), tableName, 100);
+      params.cols = 10;
+      params.dataSize = 10;
+      params.startRow = 0;
+      params.columnFamily = "test";
+      params.createTable = true;
+      params.numsplits = 3;
+      params.flushAfterRows = 0;
+
+      TestIngest.createTable(client, params);
+
+      // Confirm 4 tablets, spread across 3 tablet servers
+      client.instanceOperations().waitForBalance();
+
+      final List<TabletMetadata> tm = getLocations(((ClientContext) client).getAmple(),
+          client.tableOperations().tableIdMap().get(tableName));
+      assertEquals(4, tm.size()); // 3 splits -> 4 tablets
+
+      final Set<String> tabletLocations = new TreeSet<>();
+      tm.forEach(t -> tabletLocations.add(t.getLocation().getHostPort()));
+      assertEquals(3, tabletLocations.size()); // 3 locations
+
+      // both collections are sorted
+      assertIterableEquals(tservers, tabletLocations);
+
+      TestIngest.ingest(client, params);
+
+      final VerifyParams vp = new VerifyParams(client.properties(), tableName, params.rows);
+      vp.cols = params.cols;
+      vp.rows = params.rows;
+      vp.dataSize = params.dataSize;
+      vp.startRow = params.startRow;
+      vp.columnFamily = params.columnFamily;
+      vp.cols = params.cols;
+      VerifyIngest.verifyIngest(client, vp);
+
+      // Set the table classloader context. Context name is the URL to the context
+      // definition file
+      final String contextURL = testContextDefFile.toURI().toURL().toString();
+      client.tableOperations().setProperty(tableName, Property.TABLE_CLASSLOADER_CONTEXT.getKey(),
+          contextURL);
+
+      // check that the table is returning unique values
+      // before applying the iterator
+      final byte[] jarAValueBytes = "foo".getBytes(UTF_8);
+      assertEquals(0, countExpectedValues(client, tableName, jarAValueBytes));
+
+      // Attach a scan iterator to the table
+      IteratorSetting is = new IteratorSetting(101, "example", ITER_CLASS_NAME);
+      client.tableOperations().attachIterator(tableName, is, EnumSet.of(IteratorScope.scan));
+
+      // confirm that all values get transformed to "foo"
+      // by the iterator
+      int count = 0;
+      while (count != 1000) {
+        count = countExpectedValues(client, tableName, jarAValueBytes);
+      }
+
+      // Update the context definition to point to jar B
+      final ContextDefinition testContextDefUpdate =
+          ContextDefinition.create("test", MONITOR_INTERVAL_SECS, jarBOrigLocation);
+      final String testContextDefUpdateJson = testContextDefUpdate.toJson();
+      Files.writeString(testContextDefFile.toPath(), testContextDefUpdateJson,
+          StandardOpenOption.TRUNCATE_EXISTING);
+      assertTrue(Files.exists(testContextDefFile.toPath()));
+
+      // Wait 2x the monitor interval
+      Thread.sleep(2 * MONITOR_INTERVAL_SECS * 1000);
+
+      // Rescan with same iterator class name
+      // confirm that all values get transformed to "bar"
+      // by the iterator
+      final byte[] jarBValueBytes = "bar".getBytes(UTF_8);
+      assertEquals(1000, countExpectedValues(client, tableName, jarBValueBytes));
+
+      // Copy jar A, create a context definition using the copy, then
+      // remove the copy so that it's not found when the context classloader
+      // updates.
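+      // Because Constants.UPDATE_FAILURE_GRACE_PERIOD_MINS is set to 1 for this cluster, the
+      // stale classloader keeps serving scans until the grace period lapses; after that the
+      // failure is reported to callers, which the remainder of this test verifies.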
+      Path jarAPath = Path.of(jarAOrigLocation.toURI());
+      Path jarAPathParent = jarAPath.getParent();
+      assertNotNull(jarAPathParent);
+      Path jarACopy = jarAPathParent.resolve("jarACopy.jar");
+      assertTrue(!Files.exists(jarACopy));
+      Files.copy(jarAPath, jarACopy, StandardCopyOption.REPLACE_EXISTING);
+      assertTrue(Files.exists(jarACopy));
+
+      final ContextDefinition testContextDefUpdate2 =
+          ContextDefinition.create("test", MONITOR_INTERVAL_SECS, jarACopy.toUri().toURL());
+      Files.delete(jarACopy);
+      assertTrue(!Files.exists(jarACopy));
+
+      final String testContextDefUpdateJson2 = testContextDefUpdate2.toJson();
+      Files.writeString(testContextDefFile.toPath(), testContextDefUpdateJson2,
+          StandardOpenOption.TRUNCATE_EXISTING);
+      assertTrue(Files.exists(testContextDefFile.toPath()));
+
+      // Wait 2x the monitor interval
+      Thread.sleep(2 * MONITOR_INTERVAL_SECS * 1000);
+
+      // Rescan and confirm that all values get transformed to "bar"
+      // by the iterator. The previous class is still being used after
+      // the monitor interval because the jar referenced does not exist.
+      assertEquals(1000, countExpectedValues(client, tableName, jarBValueBytes));
+
+      // Wait 2 minutes, 2 times the UPDATE_FAILURE_GRACE_PERIOD_MINS
+      Thread.sleep(120_000);
+
+      // Scan of table with iterator setting should now fail.
+      final Scanner scanner2 = client.createScanner(tableName);
+      RuntimeException re =
+          assertThrows(RuntimeException.class, () -> scanner2.iterator().hasNext());
+      Throwable cause = re.getCause();
+      assertTrue(cause instanceof AccumuloServerException);
+    }
+  }
+
+  private int countExpectedValues(AccumuloClient client, String table, byte[] expectedValue)
+      throws TableNotFoundException, AccumuloSecurityException, AccumuloException {
+    Scanner scanner = client.createScanner(table);
+    int count = 0;
+    for (Entry<Key,Value> e : scanner) {
+      if (Arrays.equals(e.getValue().get(), expectedValue)) {
+        count++;
+      }
+    }
+    return count;
+  }
+
+  private static List<TabletMetadata> getLocations(Ample ample, String tableId) {
+    try (TabletsMetadata tabletsMetadata = ample.readTablets().forTable(TableId.of(tableId))
+        .fetch(TabletMetadata.ColumnType.LOCATION).build()) {
+      return tabletsMetadata.stream().collect(Collectors.toList());
+    }
+  }
+}
diff --git a/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/TestUtils.java b/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/TestUtils.java
new file mode 100644
index 0000000..65b7c3e
--- /dev/null
+++ b/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/TestUtils.java
@@ -0,0 +1,181 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * https://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.accumulo.classloader.lcc;
+
+import static java.nio.charset.StandardCharsets.UTF_8;
+import static org.junit.jupiter.api.Assertions.assertEquals;
+import static org.junit.jupiter.api.Assertions.assertFalse;
+import static org.junit.jupiter.api.Assertions.assertThrows;
+import static org.junit.jupiter.api.Assertions.assertTrue;
+
+import java.io.BufferedReader;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.net.URL;
+
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.FSDataOutputStream;
+import org.apache.hadoop.fs.FileSystem;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.hdfs.DFSConfigKeys;
+import org.apache.hadoop.hdfs.MiniDFSCluster;
+import org.eclipse.jetty.server.Server;
+import org.eclipse.jetty.server.handler.ResourceHandler;
+import org.eclipse.jetty.util.resource.PathResource;
+
+import edu.umd.cs.findbugs.annotations.SuppressFBWarnings;
+
+public class TestUtils {
+
+  public static class TestClassInfo {
+    private final String className;
+    private final String helloOutput;
+
+    public TestClassInfo(String className, String helloOutput) {
+      super();
+      this.className = className;
+      this.helloOutput = helloOutput;
+    }
+
+    public String getClassName() {
+      return className;
+    }
+
+    public String getHelloOutput() {
+      return helloOutput;
+    }
+  }
+
+  public static Path createContextDefinitionFile(FileSystem fs, String name, String contents)
+      throws Exception {
+    Path baseHdfsPath = new Path("/contextDefs");
+    assertTrue(fs.mkdirs(baseHdfsPath));
+    Path newContextDefinitionFile = new Path(baseHdfsPath, name);
+
+    if (contents == null) {
+      assertTrue(fs.createNewFile(newContextDefinitionFile));
+    } else {
+      try (FSDataOutputStream out = fs.create(newContextDefinitionFile)) {
+        out.writeBytes(contents);
+      }
+    }
+    assertTrue(fs.exists(newContextDefinitionFile));
+    return newContextDefinitionFile;
+  }
+
+  public static void updateContextDefinitionFile(FileSystem fs, Path defFilePath, String contents)
+      throws Exception {
+    // Update the contents of the context definition json file
+    assertTrue(fs.exists(defFilePath));
+    fs.delete(defFilePath, false);
+    assertFalse(fs.exists(defFilePath));
+
+    if (contents == null) {
+      assertTrue(fs.createNewFile(defFilePath));
+    } else {
+      try (FSDataOutputStream out = fs.create(defFilePath)) {
+        out.writeBytes(contents);
+      }
+    }
+    assertTrue(fs.exists(defFilePath));
+
+  }
+
+  public static void testClassLoads(ClassLoader cl, TestClassInfo tci) throws Exception {
+    @SuppressWarnings("unchecked")
+    Class<? extends test.Test> clazz =
+        (Class<? extends test.Test>) cl.loadClass(tci.getClassName());
+    test.Test impl = clazz.getDeclaredConstructor().newInstance();
+    assertEquals(tci.getHelloOutput(), impl.hello());
+  }
+
+  public static void testClassFailsToLoad(ClassLoader cl, TestClassInfo tci) throws Exception {
+    assertThrows(ClassNotFoundException.class, () -> cl.loadClass(tci.getClassName()));
+  }
+
+  private static String computeDatanodeDirectoryPermission() {
+    // MiniDFSCluster will check the permissions on the data directories, but does not
+    // do a good job of setting them properly. We need to get the user's umask and set
+    // the appropriate Hadoop property so that the data directories will be created
+    // with the correct permissions.
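+    // For example, a umask of 022 yields 0777 ^ 0022 = 0755, so the data
+    // directories are created rwxr-xr-x.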
+ try { + Process p = Runtime.getRuntime().exec("/bin/sh -c umask"); + try (BufferedReader bri = + new BufferedReader(new InputStreamReader(p.getInputStream(), UTF_8))) { + String line = bri.readLine(); + p.waitFor(); + + if (line == null) { + throw new IOException("umask input stream closed prematurely"); + } + short umask = Short.parseShort(line.trim(), 8); + // Need to set permission to 777 xor umask + // leading zero makes java interpret as base 8 + int newPermission = 0777 ^ umask; + + return String.format("%03o", newPermission); + } + } catch (Exception e) { + throw new RuntimeException("Error getting umask from O/S", e); + } + } + + public static MiniDFSCluster getMiniCluster() throws IOException { + System.setProperty("java.io.tmpdir", System.getProperty("user.dir") + "/target"); + + // Put the MiniDFSCluster directory in the target directory + System.setProperty("test.build.data", "target/build/test/data"); + + // Setup HDFS + Configuration conf = new Configuration(); + conf.set("hadoop.security.token.service.use_ip", "true"); + + conf.set("dfs.datanode.data.dir.perm", computeDatanodeDirectoryPermission()); + conf.setLong(DFSConfigKeys.DFS_BLOCK_SIZE_KEY, 1024 * 1024); // 1M blocksize + + MiniDFSCluster cluster = new MiniDFSCluster.Builder(conf).build(); + cluster.waitClusterUp(); + return cluster; + } + + public static Server getJetty(java.nio.file.Path resourceDirectory) throws Exception { + PathResource directory = new PathResource(resourceDirectory); + ResourceHandler handler = new ResourceHandler(); + handler.setBaseResource(directory); + + Server jetty = new Server(0); + jetty.setHandler(handler); + jetty.start(); + return jetty; + } + + @SuppressFBWarnings(value = "URLCONNECTION_SSRF_FD") + public static String computeResourceChecksum(URL resourceLocation) throws IOException { + try (InputStream is = resourceLocation.openStream()) { + return Constants.getChecksummer().digestAsHex(is); + } + } + + public static String getFileName(URL url) { + String path = url.getPath(); + return path.substring(path.lastIndexOf("/") + 1); + + } +} diff --git a/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/cache/CacheUtilsTest.java b/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/cache/CacheUtilsTest.java new file mode 100644 index 0000000..d928f4e --- /dev/null +++ b/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/cache/CacheUtilsTest.java @@ -0,0 +1,147 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.apache.accumulo.classloader.lcc.cache; + +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertNotNull; +import static org.junit.jupiter.api.Assertions.assertNull; +import static org.junit.jupiter.api.Assertions.assertThrows; +import static org.junit.jupiter.api.Assertions.assertTrue; + +import java.nio.file.Files; +import java.nio.file.Path; +import java.util.concurrent.atomic.AtomicReference; + +import org.apache.accumulo.classloader.lcc.cache.CacheUtils.LockInfo; +import org.apache.accumulo.core.spi.common.ContextClassLoaderFactory.ContextClassLoaderException; +import org.junit.jupiter.api.BeforeAll; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.io.TempDir; + +public class CacheUtilsTest { + + @TempDir + private static Path tempDir; + + private static String baseCacheDir = null; + + @BeforeAll + public static void beforeAll() throws Exception { + baseCacheDir = tempDir.resolve("base").toUri().toString(); + } + + @Test + public void testPropertyNotSet() { + ContextClassLoaderException ex = + assertThrows(ContextClassLoaderException.class, () -> CacheUtils.createBaseCacheDir(null)); + assertEquals("Error getting classloader for context: received null for cache directory", + ex.getMessage()); + } + + @Test + public void testCreateBaseDir() throws Exception { + final Path base = Path.of(tempDir.resolve("base").toUri()); + try { + assertFalse(Files.exists(base)); + CacheUtils.createBaseCacheDir(baseCacheDir); + assertTrue(Files.exists(base)); + } finally { + Files.delete(base); + } + } + + @Test + public void testCreateBaseDirMultipleTimes() throws Exception { + final Path base = Path.of(tempDir.resolve("base").toUri()); + try { + assertFalse(Files.exists(base)); + CacheUtils.createBaseCacheDir(baseCacheDir); + CacheUtils.createBaseCacheDir(baseCacheDir); + CacheUtils.createBaseCacheDir(baseCacheDir); + CacheUtils.createBaseCacheDir(baseCacheDir); + assertTrue(Files.exists(base)); + } finally { + Files.delete(base); + } + } + + @Test + public void createOrGetContextCacheDir() throws Exception { + final Path base = Path.of(tempDir.resolve("base").toUri()); + try { + assertFalse(Files.exists(base)); + CacheUtils.createOrGetContextCacheDir(baseCacheDir, "context1"); + assertTrue(Files.exists(base)); + assertTrue(Files.exists(base.resolve("context1"))); + CacheUtils.createOrGetContextCacheDir(baseCacheDir, "context2"); + assertTrue(Files.exists(base)); + assertTrue(Files.exists(base.resolve("context2"))); + CacheUtils.createOrGetContextCacheDir(baseCacheDir, "context1"); + assertTrue(Files.exists(base)); + assertTrue(Files.exists(base.resolve("context1"))); + } finally { + Files.delete(base.resolve("context1")); + Files.delete(base.resolve("context2")); + Files.delete(base); + } + } + + @Test + public void testLock() throws Exception { + final Path base = Path.of(tempDir.resolve("base").toUri()); + final Path cx1 = base.resolve("context1"); + try { + assertFalse(Files.exists(base)); + CacheUtils.createOrGetContextCacheDir(baseCacheDir, "context1"); + assertTrue(Files.exists(base)); + assertTrue(Files.exists(cx1)); + + final LockInfo lockInfo = CacheUtils.lockContextCacheDir(cx1); + try { + assertNotNull(lockInfo); + assertTrue(lockInfo.getLock().acquiredBy().equals(lockInfo.getChannel())); + assertFalse(lockInfo.getLock().isShared()); + assertTrue(lockInfo.getLock().isValid()); + + // Test that another thread can't get the lock + final 
AtomicReference error = new AtomicReference<>(); + final Thread t = new Thread(() -> { + try { + assertNull(CacheUtils.lockContextCacheDir(cx1)); + } catch (ContextClassLoaderException e) { + error.set(e); + } + }); + t.start(); + t.join(); + assertNull(error.get()); + + } finally { + lockInfo.unlock(); + } + } finally { + Files.delete(cx1.resolve("lock_file")); + Files.delete(cx1); + Files.delete(base); + } + + } + +} diff --git a/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/resolvers/FileResolversTest.java b/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/resolvers/FileResolversTest.java new file mode 100644 index 0000000..09d4ba2 --- /dev/null +++ b/modules/local-caching-classloader/src/test/java/org/apache/accumulo/classloader/lcc/resolvers/FileResolversTest.java @@ -0,0 +1,127 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.apache.accumulo.classloader.lcc.resolvers; + +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertNotNull; +import static org.junit.jupiter.api.Assertions.assertTrue; + +import java.io.IOException; +import java.io.InputStream; +import java.net.URL; +import java.nio.file.Files; +import java.nio.file.StandardOpenOption; + +import org.apache.accumulo.classloader.lcc.TestUtils; +import org.apache.accumulo.core.spi.common.ContextClassLoaderFactory.ContextClassLoaderException; +import org.apache.commons.io.IOUtils; +import org.apache.hadoop.fs.FileSystem; +import org.apache.hadoop.fs.FsUrlStreamHandlerFactory; +import org.apache.hadoop.fs.Path; +import org.apache.hadoop.hdfs.MiniDFSCluster; +import org.eclipse.jetty.server.Server; +import org.junit.jupiter.api.Test; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +public class FileResolversTest { + + private static final Logger LOG = LoggerFactory.getLogger(FileResolversTest.class); + + private long getFileSize(java.nio.file.Path p) throws IOException { + try (InputStream is = Files.newInputStream(p, StandardOpenOption.READ)) { + return IOUtils.consume(is); + } + } + + private long getFileSize(FileResolver resolver) throws IOException, ContextClassLoaderException { + try (InputStream is = resolver.getInputStream()) { + return IOUtils.consume(is); + } + } + + @Test + public void testLocalFile() throws Exception { + URL jarPath = FileResolversTest.class.getResource("/HelloWorld.jar"); + assertNotNull(jarPath); + java.nio.file.Path p = java.nio.file.Path.of(jarPath.toURI()); + final long origFileSize = getFileSize(p); + FileResolver resolver = FileResolver.resolve(jarPath); + assertTrue(resolver instanceof LocalFileResolver); + assertEquals(jarPath, resolver.getURL()); + assertEquals("HelloWorld.jar", 
resolver.getFileName()); + assertEquals(origFileSize, getFileSize(resolver)); + } + + @Test + public void testHttpFile() throws Exception { + + URL jarPath = FileResolversTest.class.getResource("/HelloWorld.jar"); + assertNotNull(jarPath); + java.nio.file.Path p = java.nio.file.Path.of(jarPath.toURI()); + final long origFileSize = getFileSize(p); + + Server jetty = TestUtils.getJetty(p.getParent()); + LOG.debug("Jetty listening at: {}", jetty.getURI()); + URL httpPath = jetty.getURI().resolve("HelloWorld.jar").toURL(); + FileResolver resolver = FileResolver.resolve(httpPath); + assertTrue(resolver instanceof HttpFileResolver); + assertEquals(httpPath, resolver.getURL()); + assertEquals("HelloWorld.jar", resolver.getFileName()); + assertEquals(origFileSize, getFileSize(resolver)); + + jetty.stop(); + jetty.join(); + } + + @Test + public void testHdfsFile() throws Exception { + + URL jarPath = FileResolversTest.class.getResource("/HelloWorld.jar"); + assertNotNull(jarPath); + java.nio.file.Path p = java.nio.file.Path.of(jarPath.toURI()); + final long origFileSize = getFileSize(p); + + MiniDFSCluster cluster = TestUtils.getMiniCluster(); + try { + FileSystem fs = cluster.getFileSystem(); + assertTrue(fs.mkdirs(new Path("/context1"))); + Path dst = new Path("/context1/HelloWorld.jar"); + fs.copyFromLocalFile(new Path(jarPath.toURI()), dst); + assertTrue(fs.exists(dst)); + + URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory(cluster.getConfiguration(0))); + + URL fullPath = new URL(fs.getUri().toString() + dst.toUri().toString()); + LOG.info("Path to hdfs file: {}", fullPath); + + FileResolver resolver = FileResolver.resolve(fullPath); + assertTrue(resolver instanceof HdfsFileResolver); + assertEquals(fullPath, resolver.getURL()); + assertEquals("HelloWorld.jar", resolver.getFileName()); + assertEquals(origFileSize, getFileSize(resolver)); + + } catch (IOException e) { + throw new RuntimeException("Error setting up mini cluster", e); + } finally { + cluster.shutdown(); + } + } + +} diff --git a/modules/local-caching-classloader/src/test/java/test/HelloWorldTemplate b/modules/local-caching-classloader/src/test/java/test/HelloWorldTemplate new file mode 100644 index 0000000..c6def69 --- /dev/null +++ b/modules/local-caching-classloader/src/test/java/test/HelloWorldTemplate @@ -0,0 +1,27 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package test; + +public class HelloWorld { + + @Override + public String toString() { + return "%%"; + } +} diff --git a/modules/local-caching-classloader/src/test/java/test/Test.java b/modules/local-caching-classloader/src/test/java/test/Test.java new file mode 100644 index 0000000..3c2b196 --- /dev/null +++ b/modules/local-caching-classloader/src/test/java/test/Test.java @@ -0,0 +1,27 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package test; + +public interface Test { + + String hello(); + + int add(); + +} diff --git a/modules/local-caching-classloader/src/test/java/test/TestTemplate b/modules/local-caching-classloader/src/test/java/test/TestTemplate new file mode 100644 index 0000000..aa10a02 --- /dev/null +++ b/modules/local-caching-classloader/src/test/java/test/TestTemplate @@ -0,0 +1,36 @@ +/* + * Licensed to the Apache Software Foundation (ASF) under one + * or more contributor license agreements. See the NOTICE file + * distributed with this work for additional information + * regarding copyright ownership. The ASF licenses this file + * to you under the Apache License, Version 2.0 (the + * "License"); you may not use this file except in compliance + * with the License. You may obtain a copy of the License at + * + * https://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package test; + +public class TestObjectXXX implements Test { + + int i = 0; + + @Override + public String hello() { + return "Hello from XXX"; + } + + @Override + public int add() { + i += 1; + return i; + } + +} diff --git a/modules/local-caching-classloader/src/test/resources/log4j2-test.properties b/modules/local-caching-classloader/src/test/resources/log4j2-test.properties new file mode 100644 index 0000000..46e83a6 --- /dev/null +++ b/modules/local-caching-classloader/src/test/resources/log4j2-test.properties @@ -0,0 +1,41 @@ +# +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. 
You may obtain a copy of the License at +# +# https://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +# + +status = error +dest = err +name = LocalCachingClassLoaderTestLoggingProperties + +appender.console.type = Console +appender.console.name = STDOUT +appender.console.target = SYSTEM_OUT +appender.console.layout.type = PatternLayout +appender.console.layout.pattern = %d{ISO8601} [%-8c{2}] %-5p: %m%n + +logger.01.name = org.apache.accumulo +logger.01.level = debug + +logger.02.name = org.apache.accumulo.classloader.lcc +logger.02.level = trace + +logger.03.name = org.apache.hadoop +logger.03.level = error + +rootLogger.level = warn +rootLogger.appenderRef.console.ref = STDOUT + diff --git a/modules/local-caching-classloader/src/test/shell/makeHelloWorldJars.sh b/modules/local-caching-classloader/src/test/shell/makeHelloWorldJars.sh new file mode 100755 index 0000000..c22e26d --- /dev/null +++ b/modules/local-caching-classloader/src/test/shell/makeHelloWorldJars.sh @@ -0,0 +1,35 @@ +#! /usr/bin/env bash +# +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# https://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +# + +if [[ -z $JAVA_HOME ]]; then + echo "JAVA_HOME is not set. Java is required to proceed" + exit 1 +fi +mkdir -p target/generated-sources/HelloWorld/test +sed "s/%%/Hello World\!/" src/test/java/test/HelloWorldTemplate > target/generated-sources/HelloWorld/test/HelloWorld.java +"$JAVA_HOME/bin/javac" target/generated-sources/HelloWorld/test/HelloWorld.java -d target/generated-sources/HelloWorld +"$JAVA_HOME/bin/jar" -cf target/test-classes/HelloWorld.jar -C target/generated-sources/HelloWorld test/HelloWorld.class +rm -r target/generated-sources/HelloWorld/test + +mkdir -p target/generated-sources/HalloWelt/test +sed "s/%%/Hallo Welt/" src/test/java/test/HelloWorldTemplate > target/generated-sources/HalloWelt/test/HelloWorld.java +"$JAVA_HOME/bin/javac" target/generated-sources/HalloWelt/test/HelloWorld.java -d target/generated-sources/HalloWelt +"$JAVA_HOME/bin/jar" -cf target/test-classes/HelloWorld2.jar -C target/generated-sources/HalloWelt test/HelloWorld.class +rm -r target/generated-sources/HalloWelt/test diff --git a/modules/local-caching-classloader/src/test/shell/makeTestJars.sh b/modules/local-caching-classloader/src/test/shell/makeTestJars.sh new file mode 100755 index 0000000..77a7fd0 --- /dev/null +++ b/modules/local-caching-classloader/src/test/shell/makeTestJars.sh @@ -0,0 +1,40 @@ +#! /usr/bin/env bash +# +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. 
See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# https://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +# + +if [[ -z $JAVA_HOME ]]; then + echo "JAVA_HOME is not set. Java is required to proceed" + exit 1 +fi + +for x in A B C D; do + mkdir -p target/generated-sources/$x/test target/test-classes/ClassLoaderTest$x + sed "s/XXX/$x/" src/test/java/test/TestTemplate > target/generated-sources/$x/test/TestObject$x.java + "$JAVA_HOME/bin/javac" --release 11 -cp target/test-classes target/generated-sources/$x/test/TestObject$x.java -d target/generated-sources/$x + "$JAVA_HOME/bin/jar" -cf target/test-classes/ClassLoaderTest$x/Test$x.jar -C target/generated-sources/$x test/TestObject$x.class + rm -r target/generated-sources/$x +done + +# Create a one-off jar that uses the A class name, but returns the E content for the hello method. +# This jar will be located in the ClassLoaderTestE directory. +mkdir -p target/generated-sources/E/test target/test-classes/ClassLoaderTestE +sed "s/XXX/A/" src/test/java/test/TestTemplate > target/generated-sources/E/test/TestObjectA.java +sed -i "s/Hello from A/Hello from E/" target/generated-sources/E/test/TestObjectA.java +"$JAVA_HOME/bin/javac" --release 11 -cp target/test-classes target/generated-sources/E/test/TestObjectA.java -d target/generated-sources/E +"$JAVA_HOME/bin/jar" -cf target/test-classes/ClassLoaderTestE/TestE.jar -C target/generated-sources/E test/TestObjectA.class diff --git a/pom.xml b/pom.xml index bd7b541..125cb8f 100644 --- a/pom.xml +++ b/pom.xml @@ -76,6 +76,7 @@ modules/example-iterators-a modules/example-iterators-b modules/vfs-class-loader + modules/local-caching-classloader scm:git:https://gitbox.apache.org/repos/asf/accumulo-classloaders.git