urbanguacamole/torrent-paradise: Decentralized DHT search site for IPFS


Repository name:

urbanguacamole/torrent-paradise

Repository URL:

https://github.com/urbanguacamole/torrent-paradise

Language:

JavaScript 55.3%

Introduction:

About

What is this?

If you don't know what Torrent Paradise/nextgen is, see the website.

This is a repository of all the tools I use to build and run torrent-paradise.ml. The 'code name' of the project is nextgen (next gen torrent search), so don't be surprised if it comes up somewhere.

Setup

Here's what the setup looks like right now:

  • VPS, Debian Bullseye, 8 GB RAM
    • user with username nextgen on the server
  • my laptop w/ Linux
    • Go toolchain installed
    • node & npm
    • Python 3 (required only for index-generator/fix-metajson.py)

Read the server-setup.md file for more precise info.

The programs create the tables they need in the database themselves. The database name is nextgen. You do need to create the materialized views (fresh and search) yourself; you can find some useful SQL for this in snippets.sql.
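After importing new data, you can refresh both views from the shell. A minimal sketch, assuming the views were already created from snippets.sql and the nextgen role can reach the database:

    # refresh the materialized views so newly imported torrents show up
    psql -U nextgen -d nextgen -c 'REFRESH MATERIALIZED VIEW fresh;'
    psql -U nextgen -d nextgen -c 'REFRESH MATERIALIZED VIEW search;'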

Each of the daemons (api, crawl-rss, seedleech-daemon) is its own standalone Go package that builds to a single binary. You have to compile the binaries yourself. There are systemd .service files available for each of the daemons.
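A minimal build sketch, assuming you compile on the laptop for an amd64 Debian server (the target architecture is an assumption):

    # cross-compile each daemon for the server
    for d in api crawl-rss seedleech-daemon; do
      (cd "$d" && GOOS=linux GOARCH=amd64 go build)
    done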

The torrent collection is a mashup of the (now no longer provided) TPB dumps, my own DHT spidering efforts, and magnetico community database dumps.

The easiest way to get your own site up and running is to start with my .csv dump. It should be easy to import into any kind of system. It contains seed/leech counts too (!). If I were to import it, I'd modify import-magnetico-db.

Torrent Paradise csv dump: MEGA BayFiles IPFS

old dump (2020): MEGA

Torrent Paradise pg_dump (database): MEGA BayFiles

Usage

Generate the index

See update-index.sh.

Generating the IPFS index will probably take a long time; a machine with high single-core performance is recommended (ipfsearch runs on Node.js).

Spider the DHT

Run go build in spider/ to compile, and scp the binary to the server. You can use the systemd service file in spider/spider.service to start the spider on server boot.
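Roughly like this (a sketch; the hostname and target paths are assumptions):

    # on the laptop: build and ship the binary plus the unit file
    (cd spider && go build)
    scp spider/spider spider/spider.service nextgen@your-server:/home/nextgen/

    # on the server: install the unit and start the spider on boot
    sudo cp /home/nextgen/spider.service /etc/systemd/system/
    sudo systemctl enable --now spider.service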

Scraping trackers for seed/leech data

Run go build in seedleech-daemon/ to compile, and scp the binary to the server. You can use the systemd service file in seedleech-daemon/seedleech.service; the deployment steps mirror the spider's above.

Import a recent magnetico community dump

Use sqlite3 on the decompressed dump to generate a .csv file. Format: infohash,name,length (bytes). Fields are optionally quoted.
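A sketch of the export, assuming magnetico's usual torrents(info_hash, name, total_size) schema (check the dump with .schema first):

    # dump infohash,name,length(bytes) as CSV
    sqlite3 -csv magnetico.sqlite3 \
      "SELECT lower(hex(info_hash)), name, total_size FROM torrents;" \
      > magnetico.csv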

Then use the Go binary in import-magnetico-db to do the import.

IPFS vs 'static'

The website directory gets deployed to IPFS; static gets deployed to the server. The static version calls the API; the IPFS version doesn't.
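Publishing the IPFS version can be as simple as this sketch, assuming a local IPFS daemon is running and you want an IPNS name pointing at the new root:

    # add the site to IPFS; -Q prints only the root CID
    CID=$(ipfs add -Qr website)
    ipfs name publish "/ipfs/$CID"   # optional: update your IPNS name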

Contributing

Before working on something, open an issue to ask if it would be okay. I would love to KISS.



