SSD-friendly FS crawler for the Aquatic BitTorrent tracker, based on the librqbit API: https://crates.io/crates/aquatic-crawler

aquatic-crawler


Crawler/aggregation tool for the Aquatic BitTorrent tracker API.

Note

Project is in development!

Roadmap

  • Targets supported
    • IPv4/IPv6 info-hash JSON/API (requires PR#233)
      • local file path
      • remote URL
  • Storage
    • File system (dump as .torrent)
      • V1
      • V2
    • Manticore full text search
    • SQLite

Install

  1. git clone https://github.com/YGGverse/aquatic-crawler.git && cd aquatic-crawler
  2. cargo build --release
  3. sudo install target/release/aquatic-crawler /usr/local/bin/aquatic-crawler

Usage

aquatic-crawler --infohash-file   /path/to/info-hash-ipv4.json \
                --infohash-file   /path/to/info-hash-ipv6.json \
                --infohash-file   /path/to/another-source.json \
                --torrent-tracker udp://host1:port \
                --torrent-tracker udp://host2:port \
                --storage         /path/to/storage
  • all arguments are optional, so multiple source and target drivers can be combined
  • running without any arguments does nothing!
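
For continuous crawling, the binary can be supervised as a service. Below is a minimal systemd unit sketch; the unit name, paths, and argument values are illustrative assumptions, not part of the project:

```
[Unit]
Description=aquatic-crawler
After=network-online.target
Wants=network-online.target

[Service]
ExecStart=/usr/local/bin/aquatic-crawler \
    --infohash-file /path/to/info-hash-ipv4.json \
    --torrent-tracker udp://host1:port \
    --storage /path/to/storage
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Install it as e.g. /etc/systemd/system/aquatic-crawler.service, then `systemctl daemon-reload && systemctl enable --now aquatic-crawler`.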

Options

-d, --debug <DEBUG>
        Debug level

        * `e` - error
        * `i` - info

        [default: ei]

-c, --clear
        Clear the previously collected index at crawl session start

--infohash-file <INFOHASH_FILE>
        Absolute path(s) to the Aquatic tracker info-hash JSON/API file(s)

        * PR#233 feature

--storage <STORAGE>
        Directory path to store preloaded data (e.g. `.torrent` files)

--torrent-tracker <TORRENT_TRACKER>
        Define custom tracker(s) used to preload the `.torrent` file info

--initial-peer <INITIAL_PEER>
        Define initial peer(s) used to preload the `.torrent` file info

--enable-dht
        Enable DHT resolver

--enable-upnp-port-forwarding
        Enable UPnP port forwarding

--enable-upload
        Enable upload

--socks-proxy-url <SOCKS_PROXY_URL>
        Connect through a SOCKS5 proxy: `socks5://[username:password@]host:port`

-s <SLEEP>
        Delay between crawl loop iterations, in seconds

        [default: 300]

-h, --help
        Print help (see a summary with '-h')

-V, --version
        Print version
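
Example

The options above can be combined; the invocation below is an illustrative sketch routing traffic through a local SOCKS5 proxy and slowing the crawl loop (all paths, hosts, and ports are placeholder assumptions):

```
aquatic-crawler --infohash-file   /path/to/info-hash-ipv4.json \
                --torrent-tracker udp://host1:port \
                --storage         /path/to/storage \
                --socks-proxy-url socks5://127.0.0.1:1080 \
                -s 600
```

With `-s 600`, the crawler re-reads the info-hash source and updates the storage directory every 10 minutes instead of the default 5.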