Project: ipfs/distributed-wikipedia-mirror
Repository: https://github.com/ipfs/distributed-wikipedia-mirror
Language: TypeScript 45.2%

# Distributed Wikipedia Mirror Project
Putting Wikipedia Snapshots on IPFS and working towards making it fully read-write.
## Existing Mirrors

There are various ways one can access the mirrors: through a DNSLink, a public gateway, or directly with a CID. You can read all about the available methods here.

### DNSLinks
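A DNSLink maps a human-readable domain to the CID of the latest snapshot. As a minimal sketch (assuming a local IPFS node; `tr.wikipedia-on-ipfs.org` is the Turkish mirror used in the examples below, and `dweb.link` is merely one example of a public gateway):

```sh
# Resolve a DNSLink to the current snapshot CID
ipfs resolve -r /ipns/tr.wikipedia-on-ipfs.org

# Browse through a local gateway...
curl -s http://127.0.0.1:8080/ipns/tr.wikipedia-on-ipfs.org/ | head

# ...or through a public gateway
curl -s https://dweb.link/ipns/tr.wikipedia-on-ipfs.org/ | head
```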
### CIDs

The latest CIDs that the DNSLinks point at can be found in snapshot-hashes.yml. Each mirror has a link to the original Kiwix ZIM archive in the footer. It can be downloaded and opened offline with the Kiwix Reader.

## Purpose

> "We believe that information—knowledge—makes the world better. That when we ask questions, get the facts, and are able to understand all perspectives on an issue, it allows us to build the foundation for a more just and tolerant society"
> -- Katherine Maher, Executive Director of the Wikimedia Foundation

## Wikipedia on IPFS -- Background

### What does it mean to put Wikipedia on IPFS?

The idea of putting Wikipedia on IPFS has been around for a while. Every few months or so someone revives the threads. You can find such discussions in this github issue about archiving wikipedia, this issue about possible integrations with Wikipedia, and this proposal for a new project.

We have two consecutive goals regarding Wikipedia on IPFS. Our first goal is to create periodic read-only snapshots of Wikipedia. A second goal will be to create a full-fledged read-write version of Wikipedia. This second goal would connect with the Wikimedia Foundation's bigger, longer-running conversation about decentralizing Wikipedia, which you can read about at https://strategy.wikimedia.org/wiki/Proposal:Distributed_Wikipedia

### (Goal 1) Read-Only Wikipedia on IPFS

The easy way to get Wikipedia content on IPFS is to periodically -- say every week -- take snapshots of all the content and add it to IPFS. That way the majority of Wikipedia users -- who only read wikipedia and don't edit -- could use all the information on wikipedia with all the benefits of IPFS. Users couldn't edit it, but users could download and archive swaths of articles, or even the whole thing. People could serve it to each other peer-to-peer, reducing the bandwidth load on Wikipedia servers. People could even distribute it to each other in closed, censored, or resource-constrained networks -- with IPFS, peers do not need to be connected to the original source of the content; being connected to anyone who has the content is enough. Effectively, the content can jump from computer to computer in a peer-to-peer way, and avoid having to connect to the content source or even the internet backbone. We've been in discussions with many groups about the potential of this kind of thing, and how it could help billions of people around the world to access information better -- either free of censorship, or circumventing serious bandwidth or latency constraints.

So far, we have achieved part of this goal: we have static snapshots of all of Wikipedia on IPFS. This is already a huge result that will help people access, keep, archive, cite, and distribute lots of content. In particular, we hope that this distribution helps people in Turkey, who find themselves in a tough situation. We are still working out a process to continue updating these snapshots; we hope to have someone at Wikimedia in the loop, as they are the authoritative source of the content. If you could help with this, please get in touch with us at [email protected].

### (Goal 2) Fully Read-Write Wikipedia on IPFS

The long term goal is to get the full-fledged read-write Wikipedia to work on top of IPFS. This is much more difficult because for a read-write application like Wikipedia to leverage the distributed nature of IPFS, we need to change how the applications write data.
A read-write Wikipedia on IPFS would allow it to be completely decentralized and would create an operation that is extremely difficult to censor. In addition to all the benefits of the static version above, the users of a read-write Wikipedia on IPFS could write content from anywhere and publish it, even without being directly connected to any wikipedia.org servers. There would be automatic version control and version history archiving. We could allow people to view, edit, and publish in completely encrypted contexts, which is important to people in highly repressive regions of the world.

A full read-write version (2) would require a strong collaboration with Wikipedia.org itself, and finishing work on important dynamic content challenges -- we are working on all the technology (2) needs, but it's not ready for prime-time yet. We will update when it is.

## How to add new Wikipedia snapshots to IPFS

The process can be nearly fully automated; however, it consists of many stages, and understanding what happens during each stage is paramount in case the ZIM format changes and our build toolchain requires debugging and updating.
Note: This is a work in progress. We intend to make it easy for anyone to create their own wikipedia snapshots and add them to IPFS, making sure those builds are deterministic and auditable, but our first emphasis has been to get the initial snapshots onto the network. This means some of the steps aren't as easy as we want them to be. If you run into trouble, seek help through a github issue.

## Manual build

If you would like to create an updated Wikipedia snapshot on IPFS, you can follow these steps.

### Step 0: Clone this repository

All commands are assumed to be run inside a cloned copy of this repository.

Clone the distributed-wikipedia-mirror git repository:

```sh
$ git clone https://github.com/ipfs/distributed-wikipedia-mirror.git
```

then

```sh
$ cd distributed-wikipedia-mirror
```
### Step 1: Install dependencies

Install the node dependencies:

```sh
$ yarn
```

Then, download the latest zim-tools and add `zimdump` to your `PATH` (it is used to unpack snapshots in Step 4).
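A quick sanity check before continuing (a sketch; `command -v` only verifies that the binary is reachable on your `PATH`):

```sh
yarn --version
command -v zimdump || echo "zimdump not found - add zim-tools to your PATH"
```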
### Step 2: Configure your IPFS Node

It is advised to use a separate IPFS node for this:

```sh
$ export IPFS_PATH=/path/to/IPFS_PATH_WIKIPEDIA_MIRROR
$ ipfs init -p server,local-discovery,flatfs,randomports --empty-repo
```

#### Tune DHT for speed

Wikipedia has a lot of blocks; to publish them as fast as possible, enable the Accelerated DHT Client:

```sh
$ ipfs config --json Experimental.AcceleratedDHTClient true
```

#### Tune datastore for speed

Make sure the repo uses `flatfs` with `sync` set to `false`:

```sh
$ ipfs config --json 'Datastore.Spec.mounts' "$(ipfs config 'Datastore.Spec.mounts' | jq -c '.[0].child.sync=false')"
```

NOTE: While the badgerv1 datastore is faster in some configurations, we chose to avoid using it with bigger builds like English because of memory issues due to the number of files.

#### HAMT sharding

Make sure you use go-ipfs 0.12 or later; it has automatic sharding of big directories.
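To confirm the settings took effect, they can be read back (a minimal sketch; expected values shown in the comments):

```sh
ipfs config Experimental.AcceleratedDHTClient              # expect: true
ipfs config Datastore.Spec.mounts | jq '.[0].child.sync'   # expect: false
```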
### Step 3: Download the latest snapshot from kiwix.org

The source of ZIM files is at https://download.kiwix.org/zim/wikipedia/. Make sure you download the `maxi` variant, which includes images.

To automate this, you can also use the `./tools/getzim.sh` helper script: first download the latest wiki lists, and after that let it create a download command for you.

Download command:
```sh
$ ./tools/getzim.sh download wikipedia wikipedia tr all maxi latest
```

Running the command will download the chosen ZIM file to the `./snapshots` directory.

### Step 4: Unpack the ZIM snapshot

Unpack the ZIM snapshot using `zimdump`:

```sh
$ zimdump dump ./snapshots/wikipedia_tr_all_maxi_2021-01.zim --dir ./tmp/wikipedia_tr_all_maxi_2021-01
```
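Before converting, it can be worth sanity-checking the unpacked tree (a sketch; the path matches the dump command above, and the file count should be of the same order as the `count-entries` value reported by `zimdump info` in the next step):

```sh
du -sh ./tmp/wikipedia_tr_all_maxi_2021-01
find ./tmp/wikipedia_tr_all_maxi_2021-01 -type f | wc -l
```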
### Step 5: Convert the unpacked zim directory to a website with mirror info

IMPORTANT: The snapshots must say who disseminated them. This effort to mirror Wikipedia snapshots is not affiliated with the Wikimedia Foundation and is not connected to the volunteers whose contributions are contained in the snapshots. The snapshots must include information explaining that they were created and disseminated by independent parties, not by Wikipedia.

The conversion to a working website and the appending of the necessary information is done by the node program under `./bin`:

```sh
$ node ./bin/run --help
```

The program requires the main page for the ZIM and online versions among its inputs. For instance, the ZIM file for Turkish Wikipedia has a main page of `Kullanıcı:The_other_Kiwix_guy/Landing`, while the online version uses `Anasayfa`.

To determine the original main page, use `./tools/find_main_page_name.sh`:

```sh
$ ./tools/find_main_page_name.sh tr.wikiquote.org
Anasayfa
```

To determine the main page in the ZIM file, open it in a Kiwix reader or use `zimdump info`:

```sh
$ zimdump info wikipedia_tr_all_maxi_2021-01.zim
count-entries: 1088190
uuid: 840fc82f-8f14-e11e-c185-6112dba6782e
cluster count: 5288
checksum: 50113b4f4ef5ddb62596d361e0707f79
main page: A/Kullanıcı:The_other_Kiwix_guy/Landing
favicon: -/favicon
```

```sh
$ zimdump info wikipedia_tr_all_maxi_2021-01.zim | grep -oP 'main page: A/\K\S+'
Kullanıcı:The_other_Kiwix_guy/Landing
```

The conversion is done on the unpacked zim directory:

```sh
node ./bin/run ./tmp/wikipedia_tr_all_maxi_2021-02 \
--hostingdnsdomain=tr.wikipedia-on-ipfs.org \
--zimfile=./snapshots/wikipedia_tr_all_maxi_2021-02.zim \
--kiwixmainpage=Kullanıcı:The_other_Kiwix_guy/Landing \
--mainpage=Anasayfa
```

### Step 6: Import website directory to IPFS

#### Increase the open file limit

In some cases you will run into a "too many open files" error; raise the limit first:

```sh
ulimit -n 65536
```

#### Add immutable copy

Add all the data to your node using `ipfs add`:

```sh
$ ipfs add -r --cid-version 1 --offline $unpacked_wiki
```

Save the last hash of the output from the above process. It is the CID of the website.
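If you prefer to capture that CID in a variable instead of copying it by hand, something like this works (a sketch; `-Q`/`--quieter` makes `ipfs add` print only the final hash, and `$unpacked_wiki` is the converted website directory from Step 5):

```sh
# Capture the root CID of the website
CID=$(ipfs add -r --cid-version 1 --offline -Q "$unpacked_wiki")
echo "Website root CID: $CID"
```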
### Step 7: Share the root CID

Share the CID of your new snapshot so people can access it and replicate it onto their machines.

### Step 8: Update *.wikipedia-on-ipfs.org

Make sure at least two full reliable copies exist before updating DNSLink.

## mirrorzim.sh

It is possible to automate steps 3-6 via a wrapper script named `mirrorzim.sh`.

To see how the script behaves, try running it on one of the smallest wikis, such as `cu`:

```sh
$ ./mirrorzim.sh --languagecode=cu --wikitype=wikipedia --hostingdnsdomain=cu.wikipedia-on-ipfs.org
```

## Docker build

A Dockerfile with the build toolchain is included in this repository.

To build the docker image:

```sh
docker build . -t distributed-wikipedia-mirror-build
```

To use it as a development environment:

```sh
docker run -it -v $(pwd):/root/distributed-wikipedia-mirror --net=host --entrypoint bash distributed-wikipedia-mirror-build
```

## How to Help

If you don't mind using a command line interface and have a lot of disk space, bandwidth, or coding skills, continue reading.

### Share mirror CID with people who can't trust DNS

Sharing a CID instead of a DNS name is useful when DNS is not reliable or trustworthy. The latest CID for a specific language mirror can be found via DNSLink:

```sh
$ ipfs resolve -r /ipns/tr.wikipedia-on-ipfs.org
/ipfs/bafy..
```

The CID can then be opened via a local node or any public gateway. You can also try the Brave browser, which ships with native support for IPFS.

### Cohost a lazy copy

Using MFS makes it easier to protect snapshots from being garbage-collected than low-level pinning, because you can assign meaningful names and it won't prefetch any blocks unless you explicitly ask. Every mirrored Wikipedia article you visit will be added to your lazy copy and will contribute to your partial mirror, so you won't need to host the entire thing.

To cohost a lazy copy, execute:

```sh
$ export LNG="tr"
$ ipfs files mkdir -p /wikipedia-mirror/$LNG
$ ipfs files cp $(ipfs resolve -r /ipns/$LNG.wikipedia-on-ipfs.org) /wikipedia-mirror/$LNG/${LNG}_$(date +%F_%T)
```

Then simply start browsing the mirror via your local gateway; every page you visit will be added to your copy.
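To check what you are cohosting and how much of it is already local, the MFS commands below can help (a sketch; `--with-local` computes how much of the DAG is present in your local repo):

```sh
ipfs files ls /wikipedia-mirror/tr
ipfs files stat --with-local /wikipedia-mirror/tr
```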
### Cohost a full copy

Steps are the same as for a lazy copy, but you execute an additional preload after the lazy copy is in place:

```sh
$ # export LNG="tr"
$ ipfs refs -r /ipns/$LNG.wikipedia-on-ipfs.org
```

Before you execute this, check if you have enough disk space to fit `CumulativeSize`:

```sh
$ # export LNG="tr"
$ ipfs object stat --human /ipns/$LNG.wikipedia-on-ipfs.org
NumLinks: 5
BlockSize: 281
LinksSize: 251
DataSize: 30
CumulativeSize: 15 GB
```

We are working on improving deduplication between snapshots, but for now YMMV.

## Code

If you would like to contribute more to this effort, look at the issues in this github repo. Especially check for issues marked with the "wishlist" label and issues marked "help wanted".