bandersnatch package¶
Package contents¶
Submodules¶
bandersnatch.configuration module¶
Module containing classes to access the bandersnatch configuration file
- class bandersnatch.configuration.BandersnatchConfig(*args: Any, **kwargs: Any)[source]¶
Bases:
object
- SHOWN_DEPRECATIONS = False¶
- class bandersnatch.configuration.SetConfigValues(json_save, root_uri, diff_file_path, diff_append_epoch, digest_name, storage_backend_name, cleanup, release_files_save, compare_method, download_mirror, download_mirror_no_fallback)[source]¶
Bases:
tuple
- bandersnatch.configuration.validate_config_values(config: configparser.ConfigParser) bandersnatch.configuration.SetConfigValues [source]¶
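A sketch of validating a configuration, assuming mirror.conf is an existing bandersnatch-style INI file with the documented [mirror] options:

    import configparser

    from bandersnatch.configuration import validate_config_values

    config = configparser.ConfigParser()
    config.read("mirror.conf")  # hypothetical path to a bandersnatch config file

    # Returns the SetConfigValues named tuple documented above.
    values = validate_config_values(config)
    print(values.storage_backend_name, values.digest_name)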
bandersnatch.delete module¶
- async bandersnatch.delete.delete_packages(config: configparser.ConfigParser, args: argparse.Namespace, master: bandersnatch.master.Master) int [source]¶
- async bandersnatch.delete.delete_path(blob_path: pathlib.Path, dry_run: bool = False) int [source]¶
bandersnatch.filter module¶
Blocklist management
- class bandersnatch.filter.Filter(*args: Any, **kwargs: Any)[source]¶
Bases:
object
Base Filter class
- property allowlist: SectionProxy¶
- property blocklist: SectionProxy¶
- check_match(**kwargs: Any) bool [source]¶
Check if the plugin matches based on the arguments provided.
- Returns
True if the values match a filter rule, False otherwise
- Return type
bool
- filter(metadata: dict) bool [source]¶
Check if the plugin matches based on the package’s metadata.
- Returns
True if the values match a filter rule, False otherwise
- Return type
bool
- name = 'filter'¶
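As an illustration of the base class above, a hypothetical subclass might implement check_match() against a fixed set of names. This sketch only exercises the documented check_match() contract (True when the values match a rule); real plugins also need to be registered under the entry point groups listed for LoadedFilters below:

    from typing import Any

    from bandersnatch.filter import Filter


    class ExampleNameFilter(Filter):
        """Hypothetical filter matching a hard-coded set of project names."""

        name = "example_name_filter"
        blocked_names = {"example-pkg", "another-pkg"}

        def check_match(self, **kwargs: Any) -> bool:
            # True when the supplied name matches one of our rules, following
            # the documented check_match() contract.
            return kwargs.get("name", "").lower() in self.blocked_names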
- class bandersnatch.filter.FilterMetadataPlugin(*args: Any, **kwargs: Any)[source]¶
Bases:
bandersnatch.filter.Filter
Plugin that blocks sync operations for an entire project based on info fields.
- name = 'metadata_plugin'¶
- class bandersnatch.filter.FilterProjectPlugin(*args: Any, **kwargs: Any)[source]¶
Bases:
bandersnatch.filter.Filter
Plugin that blocks sync operations for an entire project
- name = 'project_plugin'¶
- class bandersnatch.filter.FilterReleaseFilePlugin(*args: Any, **kwargs: Any)[source]¶
Bases:
bandersnatch.filter.Filter
Plugin that modifies the download of specific release or dist files
- name = 'release_file_plugin'¶
- class bandersnatch.filter.FilterReleasePlugin(*args: Any, **kwargs: Any)[source]¶
Bases:
bandersnatch.filter.Filter
Plugin that modifies the download of specific releases or dist files
- name = 'release_plugin'¶
- class bandersnatch.filter.LoadedFilters(load_all: bool = False)[source]¶
Bases:
object
A class to load all of the enabled filters
- ENTRYPOINT_GROUPS = ['bandersnatch_filter_plugins.v2.project', 'bandersnatch_filter_plugins.v2.metadata', 'bandersnatch_filter_plugins.v2.release', 'bandersnatch_filter_plugins.v2.release_file']¶
- filter_metadata_plugins() List[bandersnatch.filter.Filter] [source]¶
Load and return the metadata filtering plugin objects
- Returns
List of objects derived from the bandersnatch.filter.Filter class
- Return type
list of bandersnatch.filter.Filter
- filter_project_plugins() List[bandersnatch.filter.Filter] [source]¶
Load and return the project filtering plugin objects
- Returns
List of objects derived from the bandersnatch.filter.Filter class
- Return type
list of bandersnatch.filter.Filter
- filter_release_file_plugins() List[bandersnatch.filter.Filter] [source]¶
Load and return the release file filtering plugin objects
- Returns
List of objects derived from the bandersnatch.filter.Filter class
- Return type
list of bandersnatch.filter.Filter
- filter_release_plugins() List[bandersnatch.filter.Filter] [source]¶
Load and return the release filtering plugin objects
- Returns
List of objects derived from the bandersnatch.filter.Filter class
- Return type
list of bandersnatch.filter.Filter
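A sketch of loading and inspecting the release filter plugins via the accessors above; it assumes a bandersnatch configuration is already available to the loader, and load_all=True requests plugins from every entry point group in ENTRYPOINT_GROUPS:

    from bandersnatch.filter import LoadedFilters

    filters = LoadedFilters(load_all=True)

    # Each accessor returns a list of objects derived from bandersnatch.filter.Filter.
    for plugin in filters.filter_release_plugins():
        print(plugin.name)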
bandersnatch.log module¶
- bandersnatch.log.setup_logging(args: Any) logging.StreamHandler [source]¶
bandersnatch.main module¶
- async bandersnatch.main.async_main(args: argparse.Namespace, config: configparser.ConfigParser) int [source]¶
bandersnatch.master module¶
- class bandersnatch.master.Master(url: str, timeout: float = 10.0, global_timeout: Optional[float] = 18000.0, proxy: Optional[str] = None)[source]¶
Bases:
object
- async check_for_stale_cache(path: str, required_serial: Optional[int], got_serial: Optional[int]) None [source]¶
- async get(path: str, required_serial: Optional[int], **kw: Any) AsyncGenerator[aiohttp.client_reqrep.ClientResponse, None] [source]¶
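A usage sketch based on the documented signature of get(), which yields aiohttp responses; it assumes Master can be used as an async context manager to manage its HTTP session (not shown in the signatures above), and uses PyPI's public URL:

    import asyncio

    from bandersnatch.master import Master


    async def show_simple_index_status() -> None:
        async with Master("https://pypi.org") as master:
            # get() is an async generator yielding aiohttp ClientResponse objects.
            async for response in master.get("/simple/", required_serial=None):
                print(response.status)


    asyncio.run(show_simple_index_status())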
bandersnatch.mirror module¶
- class bandersnatch.mirror.BandersnatchMirror(homedir: pathlib.Path, master: bandersnatch.master.Master, storage_backend: Optional[str] = None, stop_on_error: bool = False, workers: int = 3, hash_index: bool = False, json_save: bool = False, digest_name: Optional[str] = None, root_uri: Optional[str] = None, keep_index_versions: int = 0, diff_file: Optional[Union[pathlib.Path, str]] = None, diff_append_epoch: bool = False, diff_full_path: Optional[Union[pathlib.Path, str]] = None, flock_timeout: int = 1, diff_file_list: Optional[List] = None, *, cleanup: bool = False, release_files_save: bool = True, compare_method: Optional[str] = None, download_mirror: Optional[str] = None, download_mirror_no_fallback: Optional[bool] = False)[source]¶
Bases:
bandersnatch.mirror.Mirror
- async cleanup_non_pep_503_paths(package: bandersnatch.package.Package) None [source]¶
Before 4.0 we used to store backwards-compatible named dirs for older pip. This function checks for them and cleans them up.
- async determine_packages_to_sync() None [source]¶
Update the self.packages_to_sync to contain packages that need to be synced.
- async download_file(url: str, file_size: str, upload_time: datetime.datetime, sha256sum: str, chunk_size: int = 65536, urlpath: str = '') Optional[pathlib.Path] [source]¶
- errors = False¶
- find_package_indexes_in_dir(simple_dir: pathlib.Path) List[str] [source]¶
Given a directory that contains simple packages indexes, return a sorted list of normalized package names. This presumes every directory within is a simple package index directory.
- generate_simple_page(package: bandersnatch.package.Package) str [source]¶
- property generationfile: pathlib.Path¶
- get_simple_dirs(simple_dir: pathlib.Path) List[pathlib.Path] [source]¶
Return a list of simple index directories that should be searched for package indexes when compiling the main index page.
- json_file(package_name: str) pathlib.Path [source]¶
- json_pypi_symlink(package_name: str) pathlib.Path [source]¶
- need_index_sync = True¶
- need_wrapup = False¶
- on_error(exception: BaseException, **kwargs: Dict) None [source]¶
- populate_download_urls(release_file: Dict[str, str]) Tuple[str, List[str]] [source]¶
Populate download URLs for a certain file. The possible combinations are:
download_mirror is not set: return the “url” attribute from release_file
download_mirror is set, no_fallback is false: prepend “download_mirror + path” before “url”
download_mirror is set, no_fallback is true: return only “download_mirror + path”
Theoretically we could support multiple download mirrors by prepending more URLs to the list. A standalone sketch of these rules appears after this class.
- async process_package(package: bandersnatch.package.Package) None [source]¶
- save_json_metadata(package_info: Dict, name: str) bool [source]¶
Take the JSON metadata we just fetched and save it to disk
- simple_directory(package: bandersnatch.package.Package) pathlib.Path [source]¶
- property statusfile: pathlib.Path¶
- async sync_release_files(package: bandersnatch.package.Package) None [source]¶
Purge and download files, returning the files removed and added
- sync_simple_page(package: bandersnatch.package.Package) None [source]¶
- property todolist: pathlib.Path¶
- property webdir: pathlib.Path¶
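Illustrating the populate_download_urls() rules documented above, the following standalone sketch only assembles the candidate URL list for each documented case; it is not the real method body, and the "path" key of release_file is a hypothetical stand-in for the file's path on the mirror:

    from typing import Dict, List, Optional


    def candidate_urls(
        release_file: Dict[str, str],
        download_mirror: Optional[str],
        no_fallback: bool,
    ) -> List[str]:
        # Hypothetical illustration of the documented fallback rules.
        url = release_file["url"]
        path = release_file["path"]  # hypothetical key: path of the file on the mirror
        if not download_mirror:
            return [url]  # mirror not set: upstream URL only
        if no_fallback:
            return [download_mirror + path]  # mirror only, no fallback
        return [download_mirror + path, url]  # mirror first, upstream as fallback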
- class bandersnatch.mirror.Mirror(master: bandersnatch.master.Master, workers: int = 3)[source]¶
Bases:
object
- async determine_packages_to_sync() None [source]¶
Update the self.packages_to_sync to contain packages that need to be synced.
- now = None¶
- on_error(exception: BaseException, **kwargs: Dict) None [source]¶
- async process_package(package: bandersnatch.package.Package) None [source]¶
- async bandersnatch.mirror.mirror(config: configparser.ConfigParser, specific_packages: Optional[List[str]] = None) int [source]¶
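A minimal driver sketch for the mirror() coroutine, assuming mirror.conf is a valid bandersnatch configuration file; the package names are only examples:

    import asyncio
    import configparser

    from bandersnatch.mirror import mirror

    config = configparser.ConfigParser()
    config.read("mirror.conf")  # hypothetical path to a bandersnatch config file

    # Sync just two example projects instead of the whole index; omit
    # specific_packages to mirror everything the configuration allows.
    exit_code = asyncio.run(mirror(config, specific_packages=["requests", "flask"]))
    print(exit_code)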
bandersnatch.package module¶
- class bandersnatch.package.Package(name: str, serial: int = 0)[source]¶
Bases:
object
- filter_all_releases(release_filters: List[Filter]) bool [source]¶
Filter releases and remove those that fail the filters
- filter_all_releases_files(release_file_filters: List[Filter]) bool [source]¶
Filter release files and remove empty releases after doing so.
- property release_files: List¶
bandersnatch.storage module¶
Storage management
- class bandersnatch.storage.Storage(*args: Any, config: Optional[configparser.ConfigParser] = None, **kwargs: Any)[source]¶
Bases:
object
Base Storage class
- PATH_BACKEND¶
alias of
pathlib.Path
- compare_files(file1: Union[pathlib.Path, str], file2: Union[pathlib.Path, str]) bool [source]¶
Compare two files and determine whether they contain the same data. Return True if they match
- copy_file(source: Union[pathlib.Path, str], dest: Union[pathlib.Path, str]) None [source]¶
Copy a file from source to dest
- delete(path: Union[pathlib.Path, str], dry_run: bool = False) int [source]¶
Delete the provided path.
- delete_file(path: Union[pathlib.Path, str], dry_run: bool = False) int [source]¶
Delete the provided path, recursively if necessary.
- exists(path: Union[pathlib.Path, str]) bool [source]¶
Check whether the provided path exists
- find(root: Union[pathlib.Path, str], dirs: bool = True) str [source]¶
A test helper simulating ‘find’.
Iterates over directories and filenames, given as relative paths to the root.
- get_file_size(path: Union[pathlib.Path, str]) int [source]¶
Get the size of a given path in bytes
- get_flock_path() Union[pathlib.Path, str] [source]¶
- get_hash(path: Union[pathlib.Path, str], function: str = 'sha256') str [source]¶
Get the hash digest of a given path (sha256 by default)
- get_json_paths(name: str) Sequence[Union[pathlib.Path, str]] [source]¶
- get_lock(path: str) filelock._api.BaseFileLock [source]¶
Retrieve the appropriate FileLock backend for this storage plugin. A usage sketch appears after this class.
- Parameters
path (str) – The path to use for locking
- Returns
A FileLock backend for obtaining locks
- Return type
filelock.BaseFileLock
- get_upload_time(path: Union[pathlib.Path, str]) datetime.datetime [source]¶
Get the upload time of a given path
- is_dir(path: Union[pathlib.Path, str]) bool [source]¶
Check whether the provided path is a directory.
- is_file(path: Union[pathlib.Path, str]) bool [source]¶
Check whether the provided path is a file.
- iter_dir(path: Union[pathlib.Path, str]) Generator[Union[pathlib.Path, str], None, None] [source]¶
Iterate over the path, returning the sub-paths
- mkdir(path: Union[pathlib.Path, str], exist_ok: bool = False, parents: bool = False) None [source]¶
Create the provided directory
- move_file(source: Union[pathlib.Path, str], dest: Union[pathlib.Path, str]) None [source]¶
Move a file from source to dest
- name = 'storage'¶
- open_file(path: Union[pathlib.Path, str], text: bool = True) Generator[IO, None, None] [source]¶
Yield a file context to iterate over. If text is false, the file is opened in binary (‘rb’) mode.
- read_file(path: Union[pathlib.Path, str], text: bool = True, encoding: str = 'utf-8', errors: Optional[str] = None) Union[str, bytes] [source]¶
Read and return the contents of the file, as str when text is true (using encoding and errors) or as bytes otherwise.
- rewrite(filepath: Union[pathlib.Path, str], mode: str = 'w', **kw: Any) Generator[IO, None, None] [source]¶
Rewrite an existing file atomically to avoid race conditions with programs reading it in parallel.
- rmdir(path: Union[pathlib.Path, str], recurse: bool = False, force: bool = False, ignore_errors: bool = False, dry_run: bool = False) int [source]¶
Remove the directory. If recurse is True, allow removing empty children. If force is true, remove contents destructively.
- set_upload_time(path: Union[pathlib.Path, str], time: datetime.datetime) None [source]¶
Set the upload time of a given path
- symlink(source: Union[pathlib.Path, str], dest: Union[pathlib.Path, str]) None [source]¶
Create a symlink at dest that points back at source
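As referenced under get_lock() above, the returned filelock.BaseFileLock can be used as a context manager; a minimal sketch, assuming the default filesystem backend is configured and using a hypothetical lock path:

    from bandersnatch.storage import storage_backend_plugins

    # Load the configured backend; "filesystem" is the documented default.
    storage = next(iter(storage_backend_plugins("filesystem")))

    # The lock object supports the context manager protocol.
    with storage.get_lock("/srv/pypi/.lock"):  # hypothetical lock file path
        ...  # work that must not run concurrently with another bandersnatch process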
- class bandersnatch.storage.StoragePlugin(*args: Any, config: Optional[configparser.ConfigParser] = None, **kwargs: Any)[source]¶
Bases:
bandersnatch.storage.Storage
Plugin that provides a storage backend for bandersnatch
- flock_path: Union[pathlib.Path, str]¶
- name = 'storage_plugin'¶
- bandersnatch.storage.load_storage_plugins(entrypoint_group: str, enabled_plugin: Optional[str] = None, config: Optional[configparser.ConfigParser] = None, clear_cache: bool = False) Set[bandersnatch.storage.Storage] [source]¶
Load all storage plugins that are registered with pkg_resources
- Parameters
entrypoint_group (str) – The entrypoint group name to load plugins from
enabled_plugin (str) – The optional enabled storage plugin to search for
config (configparser.ConfigParser) – The optional configparser instance to pass in
clear_cache (bool) – Whether to clear the plugin cache
- Returns
A set of objects derived from the Storage class
- Return type
Set of Storage
- bandersnatch.storage.storage_backend_plugins(backend: Optional[str] = 'filesystem', config: Optional[configparser.ConfigParser] = None, clear_cache: bool = False) Iterable[bandersnatch.storage.Storage] [source]¶
Load and return the storage backend plugin objects
- Parameters
backend (str) – The optional enabled storage plugin to search for
config (configparser.ConfigParser) – The optional configparser instance to pass in
clear_cache (bool) – Whether to clear the plugin cache
- Returns
An iterable of objects derived from the bandersnatch.storage.Storage class
- Return type
iterable of bandersnatch.storage.Storage
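A sketch of loading the configured backend and calling a few of the generic Storage helpers documented above; it assumes the default filesystem backend and uses a hypothetical mirror path:

    from bandersnatch.storage import storage_backend_plugins

    storage = next(iter(storage_backend_plugins("filesystem")))

    path = "/srv/pypi/web/simple/index.html"  # hypothetical file on the mirror
    if storage.exists(path):
        # get_hash() defaults to sha256 per the Storage documentation above.
        print(storage.get_file_size(path), storage.get_hash(path))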
bandersnatch.utils module¶
- bandersnatch.utils.bandersnatch_safe_name(name: str) str [source]¶
Convert an arbitrary string to a standard distribution name. Any run of characters other than alphanumerics and ‘.’ is replaced with a single ‘-’.
This was copied from pkg_resources (part of setuptools)
bandersnatch also lower-cases the returned name
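Per the documented behaviour, runs of characters other than alphanumerics and ‘.’ collapse to a single ‘-’, and the result is lower-cased:

    from bandersnatch.utils import bandersnatch_safe_name

    print(bandersnatch_safe_name("Foo.Bar_baz"))  # expected per the docs: "foo.bar-baz"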
- bandersnatch.utils.find(root: Union[pathlib.Path, str], dirs: bool = True) str [source]¶
A test helper simulating ‘find’.
Iterates over directories and filenames, given as relative paths to the root.
- bandersnatch.utils.hash(path: pathlib.Path, function: str = 'sha256') str [source]¶
- bandersnatch.utils.make_time_stamp() str [source]¶
Helper function that returns a timestamp suitable for use in a filename on any OS
- bandersnatch.utils.recursive_find_files(files: Set[pathlib.Path], base_dir: pathlib.Path) None [source]¶
- bandersnatch.utils.rewrite(filepath: Union[str, pathlib.Path], mode: str = 'w', **kw: Any) Generator[IO, None, None] [source]¶
Rewrite an existing file atomically to avoid race conditions with programs reading it in parallel.
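A usage sketch, assuming rewrite() is used as a context manager that yields a writable file object and atomically replaces the target on exit; the path is hypothetical:

    from bandersnatch.utils import rewrite

    # Readers of index.html never observe a partially written file.
    with rewrite("/srv/pypi/web/simple/index.html") as f:  # hypothetical path
        f.write("<html>...</html>\n")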
- bandersnatch.utils.unlink_parent_dir(path: pathlib.Path) None [source]¶
Remove a file and, if its parent directory is then empty, remove the directory as well
bandersnatch.verify module¶
- async bandersnatch.verify.delete_unowned_files(mirror_base: pathlib.Path, executor: concurrent.futures.thread.ThreadPoolExecutor, all_package_files: List[pathlib.Path], dry_run: bool) int [source]¶
- async bandersnatch.verify.get_latest_json(master: bandersnatch.master.Master, json_path: pathlib.Path, config: configparser.ConfigParser, executor: Optional[concurrent.futures.thread.ThreadPoolExecutor] = None, delete_removed_packages: bool = False) None [source]¶
- async bandersnatch.verify.metadata_verify(config: configparser.ConfigParser, args: argparse.Namespace) int [source]¶
Crawl all saved JSON metadata (or fetch it online) to check that we have all packages; if delete is set, generate a diff of unowned files
- bandersnatch.verify.on_error(stop_on_error: bool, exception: BaseException, package: str) None [source]¶
- async bandersnatch.verify.verify(master: bandersnatch.master.Master, config: configparser.ConfigParser, json_file: str, mirror_base_path: pathlib.Path, all_package_files: List[pathlib.Path], args: argparse.Namespace, executor: Optional[concurrent.futures.thread.ThreadPoolExecutor] = None, releases_key: str = 'releases') None [source]¶
- async bandersnatch.verify.verify_producer(master: bandersnatch.master.Master, config: configparser.ConfigParser, all_package_files: List[pathlib.Path], mirror_base_path: pathlib.Path, json_files: List[str], args: argparse.Namespace, executor: Optional[concurrent.futures.thread.ThreadPoolExecutor] = None) None [source]¶