Lord Raiden's Avatar
Posts: 1,562 | Thanked: 349 times | Joined on Jun 2008
#1
Ok, I've got a slight problem I'm hoping everyone can help me with. I'm looking for a quick, easy, cut-and-dried way to download Virtual Earth maps and convert them to the vestreet.db format. I'm using Virtual Earth, and specifically the JPEG versions of its maps rather than the other map sources, which use PNG. (VE also serves PNG, but it has a JPEG version too that looks tons better, and yes, they do exist. Just change the .png extension to .jpg in the URL for better-quality VE maps.)

I'd like to be able to download the entire lower peninsula of Michigan onto my computer, process it, then convert it to the vestreet.db format. The idea behind that is: (a) my computer is gobloads faster than my N810, so it can process maps faster, and (b) it can download them faster. I'm using an Athlon XP 3000 with 1 GB of RAM, so that leaves my N810 in the dust in processing power. Hence my wanting to use the computer to compile the maps.

The other option is to get a complete, ready-to-use Virtual Earth database for the Eastern US, or even the world. Space is not an issue, since I have an 8 GB add-in card I can use. I know someone once wrote a Perl script that would convert Google Maps tiles into a ready-to-use Maemo Mapper database; I'm hoping someone did something like that for Virtual Earth. Any help is appreciated. I looked around for other possible solutions on the forums (gawds, I searched for like 45 minutes) but didn't find anything, so I'm hoping someone can help. Thanks.
 
Posts: 2,102 | Thanked: 1,309 times | Joined on Sep 2006
#2
Take a peek at the source code to see what type of database is used, and the format of the data.

I agree that it would be useful to have a PC downloader to do map grabbing (especially before going on a big trip). It shouldn't be too difficult; does anyone have some spare time to write one?
 
Lord Raiden's Avatar
Posts: 1,562 | Thanked: 349 times | Joined on Jun 2008
#3
I already looked at the source (I felt like hacking it myself), but got nothing but gibberish and stuff that didn't make sense to me.
 
Lord Raiden's Avatar
Posts: 1,562 | Thanked: 349 times | Joined on Jun 2008
#4
Ok, here's what I've figured out. I'm willing to do the work on a Perl or PHP script that downloads and converts all of the map pieces so they can be used in Maemo Mapper, but downloaded and processed on a PC, which is much faster. The problem is, I don't know the first thing about the API. I don't know how the database is formatted or structured, or much about how any of it works based on GPS data.

I have, however, figured out this much. Take the Virtual Earth URLs, for example. The GPS position is converted from the GPS ID into a small 10-digit number formatted like this: xxxxx-xxxxx

We enter the base URL into Maemo Mapper like this:

http://r0.ortho.tiles.virtualearth.n.../r%0s.jpg?g=45

And when Maemo Mapper grabs one of the map images, it converts the above information into something like this:

http://r0.ortho.tiles.virtualearth.n...03020.jpg?g=45

So somewhere along the line, item A is converted into item B. I'd love to get hold of the developer of Maemo Mapper, but I couldn't find an address for him. If it were possible to contact him, I might be able to figure out how the database is written out, and if I could do that, then I could possibly develop a script myself. So I'm not against doing the work myself; I just have no clue what I'm doing (API-wise, anyway) in order to achieve the end result, and I want to be sure nobody else has already done the work.
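For what it's worth, the trailing digits in that second URL look like the standard Virtual Earth "quadkey": the tile's x and y indices with their bits interleaved into a base-4 string, one digit per zoom level, which then replaces the placeholder in the base URL. A minimal sketch in Python (the tile_x/tile_y/zoom naming is mine, not Maemo Mapper's):

```python
def ve_quadkey(tile_x: int, tile_y: int, zoom: int) -> str:
    """Interleave the bits of tile_x and tile_y into a base-4 quadkey.

    Each digit selects one of the four child tiles at the next zoom
    level: bit 0 comes from x, bit 1 from y.
    """
    digits = []
    for i in range(zoom, 0, -1):
        mask = 1 << (i - 1)
        digit = 0
        if tile_x & mask:
            digit += 1
        if tile_y & mask:
            digit += 2
        digits.append(str(digit))
    return "".join(digits)

# e.g. tile (x=3, y=5) at zoom level 3:
key = ve_quadkey(3, 5, 3)
```

A downloader script could loop over a range of tile x/y values, build the quadkey for each, and substitute it into the repository URL where the placeholder goes.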

If someone's willing to help me somehow, I'll even try to write downloaders for all the other repositories, if possible, as thanks.
 
gnuite's Avatar
Posts: 1,245 | Thanked: 421 times | Joined on Dec 2005
#5
The database is in a very simple format. It's a GDBM database, and the relevant code to look at for the key/value pair is around line 800 of the maps.c file.

The key is a simple 12-byte set of three 4-byte (network byte order) integers: zoom, tilex, and tiley. The value is just the binary contents of the PNG or JPEG image.
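Based on that description, building a key on the PC side could look like this in Python (a sketch of my own, not taken from maps.c; dbm.gnu is Python's GDBM wrapper and is only available where GDBM is installed):

```python
import struct

def tile_key(zoom: int, tile_x: int, tile_y: int) -> bytes:
    # Three 4-byte integers in network (big-endian) byte order:
    # zoom, tilex, tiley -- a 12-byte key in total.
    return struct.pack(">III", zoom, tile_x, tile_y)

# Writing a tile would then be roughly:
#
#   import dbm.gnu                           # GDBM wrapper, Unix-only
#   with dbm.gnu.open("vestreet.db", "c") as db:
#       db[tile_key(4, 9, 5)] = jpeg_bytes   # raw JPEG/PNG file contents
```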

The biggest problem that you might run into is that GDBM's database format is not cross-platform-friendly. The implementation is endian-specific, and so a database built on i386 will not work on a big-endian architecture like the internet tablet's ARM. You should be able to get around this with emulation, but I've never tried it.

GDBM was chosen over sqlite3 (a platform-independent database implementation) for performance and disk-space reasons (sqlite3 was a lot slower, and the resulting database moderately larger). With the relatively recent caching implementation, though, it might be worth switching to sqlite3, or maybe offering a choice between the two.
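For comparison, a hypothetical sqlite3 layout mirroring the same key/value structure (the table and column names are my invention, not anything Maemo Mapper actually uses):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # use a real file path on disk
conn.execute(
    """CREATE TABLE IF NOT EXISTS tiles (
           zoom  INTEGER NOT NULL,
           tilex INTEGER NOT NULL,
           tiley INTEGER NOT NULL,
           image BLOB    NOT NULL,
           PRIMARY KEY (zoom, tilex, tiley)
       )"""
)
# Same (zoom, tilex, tiley) -> image-bytes mapping as the GDBM version.
conn.execute("INSERT INTO tiles VALUES (?, ?, ?, ?)", (4, 9, 5, b"\xff\xd8fake-jpeg"))
row = conn.execute(
    "SELECT image FROM tiles WHERE zoom=? AND tilex=? AND tiley=?", (4, 9, 5)
).fetchone()
```

Unlike GDBM, the sqlite3 file format is identical across architectures, which would sidestep the endianness question entirely.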
 
Posts: 2,102 | Thanked: 1,309 times | Joined on Sep 2006
#6
Quote:
"The implementation is endian-specific, and so a database built on i386 will not work on a big-endian architecture like the internet tablet's ARM."
Both ARM and x86 are little-endian AFAIK (yes, ARM can be either, but we run in little-endian mode on the tablets).
 

Lord Raiden's Avatar
Posts: 1,562 | Thanked: 349 times | Joined on Jun 2008
#7
Ah, thanks for the info, gnuite; that explains the whole numbering scheme a lot better. I'll have to dig through that file to get an idea of how it's done, then port that to Perl somehow.

lardman: Ah, so in theory I could build this on Linux, then just copy it over to the N810 without much trouble. Sweet.
 
Posts: 2,102 | Thanked: 1,309 times | Joined on Sep 2006
#8
If by copy you mean "compile in scratchbox and copy over", then yes.

There may still be padding issues in the database; you'd need to do some testing to see whether there are other platform-specific issues to overcome.
 
Lord Raiden's Avatar
Posts: 1,562 | Thanked: 349 times | Joined on Jun 2008
#9
Good lord, has anyone ever mentioned how much of a PITA GDBM is to work with? >.<;; Even the API leaves you going WTF? I'd say this is going to take me a while to figure out, so don't expect any more updates from me for a while. But I'm now flatly determined to get this sorted out, so I'm not gonna stop until I either rot the last of my brain cells (heh, that won't take much these days) or figure this darned thing out.

Kinda wish that GDBM was as easy to work with as MySQL. On the same note, it's too bad MySQL isn't lightweight enough to be used in a scenario like this. It'd certainly make my life a whole lot easier.
 

Lord Raiden's Avatar
Posts: 1,562 | Thanked: 349 times | Joined on Jun 2008
#10
Ya know, I've been going over some of the data for the map tile system, and based on my current estimates, the Google map tile repository must be in the 1,000-terabyte category. Seriously. Using an average file size of just 3 KB per tile (files range from 312 bytes up to 50 KB depending on the level of detail), it comes out to somewhere around 192+ GB just for the state of Michigan! 0.0 That's well over 2.8 million tiles! I'd hate to think how many gigs it would take just to do the Eastern US. That'd be seriously scary.

If that's actually true, or even remotely close, then maps-on-the-fly is starting to look better by the day. >.<;;
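A back-of-envelope way to sanity-check estimates like these is the standard Web-Mercator tiling math (the Michigan bounding box below is a rough guess on my part, not an exact figure):

```python
import math

def tile_xy(lat, lon, zoom):
    # Standard Web-Mercator (slippy-map) tile indices for a coordinate.
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_r = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_r) + 1.0 / math.cos(lat_r)) / math.pi) / 2.0 * n)
    return x, y

def count_tiles(lat_min, lat_max, lon_min, lon_max, zoom):
    # Tile y grows southward, so lat_max maps to the smaller y.
    x0, y0 = tile_xy(lat_max, lon_min, zoom)
    x1, y1 = tile_xy(lat_min, lon_max, zoom)
    return (x1 - x0 + 1) * (y1 - y0 + 1)

# Very rough lower-peninsula bounding box, a single zoom level:
michigan_z15 = count_tiles(41.7, 45.9, -87.1, -82.4, 15)
```

Multiplying the count for each zoom level you plan to store by an average tile size gives a quick disk-space estimate before committing to a big download.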
 