All Of Wikipedia On Pandora?


Another option is with sdictviewer. I use this with my Zaurus right now. The Wiki database is stored in 3 dictionary files and sdictviewer searches all of them automatically.

Canguy
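(For reference, "searches all of them automatically" just means trying each dictionary file in turn until the term is found. A toy Python sketch of that idea, assuming simple title-tab-article text files and made-up file names rather than the real compressed Sdict format:)

# Toy illustration of searching several dictionary files in sequence,
# the way sdictviewer queries each of the Wiki dictionary files.
# File names and the title<TAB>article layout are assumptions, not Sdict.
DICT_FILES = ["wiki_part1.txt", "wiki_part2.txt", "wiki_part3.txt"]

def lookup(term):
    term = term.lower()
    for path in DICT_FILES:
        with open(path, encoding="utf-8") as f:
            for line in f:
                title, _, body = line.partition("\t")
                if title.lower() == term:
                    return path, body.strip()
    return None, None

source, article = lookup("Pandora")
print(article if article is not None else "not found")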
 
The index (based on when I was running this on my old laptop) takes up exactly the same amount of space as the data files. I plan to implement index compression.
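(A minimal sketch of the kind of index compression being planned, assuming the index is a title-to-byte-offset table; the file name and on-disk layout here are guesses, not the reader's actual format:)

# Store the title -> byte-offset table gzip-compressed instead of raw.
# Index layout and file name are assumptions, not the actual format.
import gzip, pickle

def save_index(index, path="titles.idx.gz"):
    # index: dict mapping article title -> byte offset in the dump file
    with gzip.open(path, "wb") as f:
        pickle.dump(index, f, protocol=pickle.HIGHEST_PROTOCOL)

def load_index(path="titles.idx.gz"):
    with gzip.open(path, "rb") as f:
        return pickle.load(f)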
 
I always thought there was a Wikipedia image without pictures. Shouldn't that one be considerably smaller than the whole Wikipedia package? Together with index compression, that should result in a package under 3 GB, I'd guess.
 
Wikipedia without images is 3.7GB. Wikipedia with images (the last time I checked) is over 120 gigabytes.

Canguy said:
Another option is with sdictviewer. I use this with my Zaurus right now. The Wiki database is stored in 3 dictionary files and sdictviewer searches all of them automatically.

Canguy
I think I'd much rather use Firefox.

cb88 said:
hmmmm sudo rm -rf ./wikipedia

wonder what will happen X.x
I haven't had a chance to use wikipediafs, but I don't think anything would happen unless you're a wikipedia admin. :p
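(Background for the joke: wikipediafs is a FUSE filesystem that exposes articles as files and maps writes back to edits on the live site, so actually deleting pages would need the corresponding rights on Wikipedia itself. Below is a minimal read-only FUSE sketch in the same spirit, written with the fusepy library; it is not wikipediafs's actual code, and the articles are hard-coded placeholders:)

# Minimal read-only FUSE filesystem serving a couple of hard-coded
# "articles" as files. No write/unlink handlers are implemented, so an
# rm -rf against the mountpoint simply fails. Uses the fusepy library.
import errno, stat, sys
from fuse import FUSE, FuseOSError, Operations

ARTICLES = {
    "Pandora": b"Placeholder article text...\n",
    "Wikipedia": b"Placeholder article text...\n",
}

class WikiFS(Operations):
    def getattr(self, path, fh=None):
        if path == "/":
            return {"st_mode": stat.S_IFDIR | 0o555, "st_nlink": 2}
        name = path.lstrip("/")
        if name in ARTICLES:
            return {"st_mode": stat.S_IFREG | 0o444, "st_nlink": 1,
                    "st_size": len(ARTICLES[name])}
        raise FuseOSError(errno.ENOENT)

    def readdir(self, path, fh):
        return [".", ".."] + list(ARTICLES)

    def read(self, path, size, offset, fh):
        data = ARTICLES.get(path.lstrip("/"), b"")
        return data[offset:offset + size]

if __name__ == "__main__":
    FUSE(WikiFS(), sys.argv[1], foreground=True, ro=True)

(Run as "python wikifs.py /some/mountpoint"; the articles then appear as read-only files under that mountpoint.)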
 
frantically googles wikipedia exploits..... X.x

In all honesty, I had a page on Wikipedia for about 6 months that was total BS.

it was here http://en.wikipedia.org/wiki/Tachyon_parti...ass_accelerator

It got deleted for being a "nonsensical hoax" LOL ... took a while for them to get it, though, because it looked legit.

Sadly I don't have a copy of it... :-( It was hilarious.
 
atomicthumbs said:
ashdjones said:
You only need the database dump:

http://www.wikitaxi.org/delphi/doku.php/pr.../wikitaxi/index

I think this program makes its own index or something. The file size doubles.



1. It's written in Delphi, and I see no source code.

2. It's for Windows and x86 only.

3. Why bother? Mine allows you to use whatever web browser you choose.


Yeah of course. I was just pointing out that a software solution can be made that utilises the database dumps alone, without having to download the index.
 
Once copied onto the SD card for :pandora1: , how difficult would it be to get it updated? The data, not the app.

atomicthumbs, would your port be multilingual, meaning use it for Wikipedias from different languages? Thanks.
 
OpenTheBox said:
Once copied onto the SD card for :pandora1: , how difficult would it be to get it updated? The data, not the app.

atomicthumbs, would your port be multilingual, meaning use it for Wikipedias from different languages? Thanks.
Yes, but it might take a little while before I get to the other wikipedias. I might have them all, though. German, English, and Spanish to start.

Edit:

You just download the new package I release every once in a while. It takes a long time to download Wikipedia, and a long time to build the index.

ashdjones said:
Yeah of course. I was just pointing out that a software solution can be made that utilises the database dumps alone, without having to download the index.
You don't download the index; you build it yourself. And you can use a solution like this if you absolutely love having no full-text search, or a full-text search that takes several hours to complete.
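(A rough sketch of what "build it yourself" can involve, assuming an uncompressed pages-articles XML dump: one pass records the byte offset of every <page> element keyed by its <title>, after which a title lookup is a single seek instead of a scan of the whole file. This is an illustration only, not the offline reader's actual indexer:)

# Build a title -> byte-offset index over an uncompressed XML dump so
# that fetching an article is a seek, not a full scan. Illustration only.
import pickle, re

TITLE_RE = re.compile(rb"<title>(.*?)</title>")

def build_index(dump_path, index_path="titles.idx"):
    index, offset, page_start = {}, 0, None
    with open(dump_path, "rb") as f:
        for line in f:
            if b"<page>" in line:
                page_start = offset
            else:
                m = TITLE_RE.search(line)
                if m and page_start is not None:
                    index[m.group(1).decode("utf-8")] = page_start
                    page_start = None
            offset += len(line)
    with open(index_path, "wb") as f:
        pickle.dump(index, f)
    return index

def fetch(dump_path, index, title):
    # Seek straight to the recorded <page> offset and read up to </page>.
    with open(dump_path, "rb") as f:
        f.seek(index[title])
        page = []
        for line in f:
            page.append(line)
            if b"</page>" in line:
                break
    return b"".join(page).decode("utf-8")

(Full-text search over the article bodies is the expensive part; a plain title index like this does nothing to help it, which is the point being made above.)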
 
If/when you release updated packages, would it be possible to release diffs/patches as well? I rather doubt there are going to be several GB of changes for every update, so it'd be a waste of bandwidth to download data that's mostly the same each time the package is updated.
 
BigTruck said:
If/when you release updated packages, would it be possible to release diffs/patches as well? I rather doubt there are going to be several GB of changes for every update, so it'd be a waste of bandwidth to download data that's mostly the same each time the package is updated.
I could, but you'd have to build your own index.
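(One hedged way such diffs could work, reusing the build_index()/fetch() helpers sketched earlier in the thread: hash every article in the old and new dumps and ship only the pages whose content changed or is new. The receiver splices those into its copy and then rebuilds the index locally, which is why you'd still have to build your own index:)

# Hypothetical per-article delta: compare SHA-1 hashes of each article
# between two dumps and report what changed. Reuses build_index()/fetch()
# from the earlier sketch; not the actual update mechanism.
import hashlib

def article_hashes(dump_path, index):
    return {title: hashlib.sha1(fetch(dump_path, index, title).encode("utf-8")).hexdigest()
            for title in index}

def make_delta(old_dump, old_index, new_dump, new_index):
    old = article_hashes(old_dump, old_index)
    new = article_hashes(new_dump, new_index)
    changed = {t for t, h in new.items() if old.get(t) != h}  # new or modified pages
    removed = set(old) - set(new)                             # deleted pages
    return changed, removed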
 
atomicthumbs said:
I'm working on a port of the Wikipedia Offline Reader, as well as a complete package containing the most recent database dump, which will be released when the Pandora is released.

That is great. I love Wikipedia and I'd really enjoy having it with me everywhere, available whenever I need it without worrying about an internet connection all the time.
 
atomicthumbs said:
You don't download the index; you build it yourself. And you can use a solution like this if you absolutely love having no full-text search, or a full-text search that takes several hours to complete.
Okay I see what you mean. The program I mentioned does the same thing actually.

I'm not having much luck with it though; the download of the Wikipedia archive appears to have been corrupted.
 
I just thought of having offline wikipedia at autostart and a big "Don't panic"-button on top of the Pandora :)
 
conso said:
I just thought of having offline wikipedia at autostart and a big "Don't panic"-button on top of the Pandora :)
Hah! I knew I couldn't be the only person here who thought of that! :p
 
atomicthumbs said:
I'm already doing it. No need to duplicate my work.
How is progress? Are you still working on it? This is still one of the apps I'm most looking forward to!
:)
 
.Gogeta§§J4BR. said:
atomicthumbs said:
Wikipedia without images is 3.7GB. Wikipedia with images (the last time I checked) is over 120 gigabytes.
Nice to know! I wonder if you could store only the "main" images; they might be very useful.


Not true.
I have it running on my S60 phone -> 587 MB without images.
 
Flat said:
Not true.
I have it running on my S60 phone -> 587 MB without images.
That's most likely the reduced edition; could you point me to the source of your version? I have an S60 as well, and would be interested in it either way :)
 