Today the “Kane County Chronicle” is running a story about a local resident who found that Fermilab is located in the wrong place on Google Maps. He is quoted as saying, “I wondered what was going on. It just seemed very weird.” Weird indeed. You see, Google has chosen Tele Atlas as its map data provider of choice, which means a European data provider is supplying map data for the United States. This is just one example of how bad Tele Atlas’s data can be. So while Google Maps is free and cool, it might not always get you where you want to go.
Here is the whole article:
BATAVIA – Fermilab is a high-energy physics laboratory where scientists are trying to answer one of the biggest questions in the universe – how does the universe work?
Now officials at Fermilab are baffled by another mystery – why does Google Maps show Fermi National Accelerator Lab as being along Route 47 between McDonald and Burlington roads when its campus is located in Batavia, 15 miles east of that location?
St. Charles resident Chris Madsen came across this mystery recently. He uses Google Maps for a Web page he runs for the Kane County Audubon Society.
“I wondered what was going on,” Madsen said. “It just seemed very weird.”
Fermilab spokesman Kurt Riesselmann is also perplexed.
“The Department of Energy doesn’t own land out there,” Riesselmann said. “I don’t know what it is, whether it is farmland or something like that.”
The DOE owns Fermilab.
Riesselmann said Fermilab officials plan to contact Google.
“We will work with Google to solve this mystery,” Riesselmann said.
Fermilab, originally named the National Accelerator Laboratory, was commissioned by the U.S. Atomic Energy Commission, under a bill signed by President Lyndon Johnson on November 21, 1967. Fermilab’s 6,800-acre site originally was home to farmland, and to the village of Weston.
Following funding cutbacks, President Bush this summer signed legislation that provides $62.5 million for the Office of Science to ensure that Fermilab, Argonne and other scientific facilities are able to continue their research and retain staff.
As an aside, when I checked Google Maps this morning, it looked like Fermilab was in the proper place.
Now that TomTom has bought Tele Atlas and there are rumors that Google may purchase Navteq, it may be time to consider a new mapping company, at least in the USA. While this isn’t a new idea in itself, I believe that my method of collecting the data is. Here is how I would do it.
First, I would start with the TIGER map data supplied by the US Census Bureau. While this data is not totally accurate or complete, it would offer a good basis. Second, I would develop a GPS/GPRS device that could track and report where it was. It should run off of twelve volts and be very compact, reporting its position, heading, and speed back to a server over GPRS. Third, I would offer truck drivers 10 to 20 cents per mile for every mile they drive with the device turned on, the goal being to sign up enough individual truck drivers, or even an entire trucking company, to put the device on their trucks. The reported GPS tracks would then be used to correct the TIGER data and bring it in line with what the GPS actually reports. As an added bonus, the network of devices could be used to start collecting traffic data.
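The correction step could be sketched like this. It is a deliberately minimal, hypothetical example (real TIGER road segments are polylines in geographic coordinates, not flat 2-D line segments, and real map conflation is far more involved): average the signed perpendicular offset of the reported GPS fixes from a road segment, then shift the segment by that offset.

```python
import math

def average_offset(segment, fixes):
    """segment: ((x1, y1), (x2, y2)); fixes: list of (x, y) GPS points.
    Returns the mean signed perpendicular distance from the fixes to
    the segment's line."""
    (x1, y1), (x2, y2) = segment
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    total = 0.0
    for (px, py) in fixes:
        # 2-D cross product gives signed area; divide by length for distance
        total += ((px - x1) * dy - (py - y1) * dx) / length
    return total / len(fixes)

def corrected_segment(segment, fixes):
    """Shift the segment sideways so it runs through the cloud of fixes."""
    off = average_offset(segment, fixes)
    (x1, y1), (x2, y2) = segment
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    # unit normal to the segment, oriented to match the sign convention above
    nx, ny = dy / length, -dx / length
    return ((x1 + nx * off, y1 + ny * off),
            (x2 + nx * off, y2 + ny * off))
```

With enough trucks reporting, each segment would accumulate hundreds of fixes and the averaged offset would wash out individual GPS error.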
So I wonder why none of the mapping companies are already doing this…
While sitting at work today I spotted one of the Imersionmedia bugs rolling down Arlington Heights Road in Arlington Heights, IL. According to their website, they are scheduled to be in Chicago, but not this far out in the ’burbs. If they drive back, I will try to get a snapshot of them. It would be cool if they provided real-time tracking of the cars via their web page, even if it does violate Google’s TOS.
I don’t think they were snapping pictures, as they cruised down the center lane instead of the rightmost one, but maybe that is how they operate. Pretty cool to see; it’s amazing what you can notice if you keep your eyes open.
Yesterday at work we released a new version of our next-generation mapping application. Some feature highlights include geo-IP, dynamic proximity searching, and dynamic driving directions (you can drag the start and end points). There is a whole bunch of backend work that went into this release.
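As an illustration of what proximity searching involves under the hood, here is a minimal sketch (not our actual code, and the POI coordinates are approximate) that ranks points of interest by great-circle distance from a query location using the haversine formula:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearest(pois, lat, lon, limit=5):
    """Return the `limit` POIs closest to (lat, lon)."""
    return sorted(pois, key=lambda p: haversine_miles(lat, lon, p[1], p[2]))[:limit]

pois = [("Fermilab", 41.832, -88.252),
        ("Sears Tower", 41.879, -87.636)]
```

A production version would of course push the distance calculation into the database with a spatial index rather than sorting every row in application code.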
Try it out; I hope that you like it.
Melissadata is a data service provider that offers ZIP code data among other things. I recently had the opportunity to work with their ZIPdata product. They provide their data in downloadable .zip files, which consist of (I assume) MSSQL dumps of their database. Since work is not a Microsoft shop but rather an open source shop, I had the pleasure of creating a Perl script to take this data and make it suitable for loading into a Postgres database. Now, Melissadata was nice enough to separate the table creation from the data, but in at least one instance the table create did not match the data. So I have some suggestions for the Melissadata team so they can offer everyone a better product:
1. The *.sql files (the files with the CREATE TABLE statements in them) should, at minimum, use standard SQL appropriate for creating basic tables in MySQL, PostgreSQL, and Oracle.
2. The *.dat files should be exported in tab-delimited format. This makes importing the data into the chosen database really simple; I wouldn’t have to split each line into fields based on string lengths.
3. Double-check that the fields provided in the data files match the fields being created in the .sql file. (Hint: census.sql is missing the 62-64 age column.) This broke my script, because I trusted that your .sql files were accurate and in line with your published user’s guide.
4. Your file names should match the names of the tables that the data or table create is associated with. Any standard dump tool (mysqldump, pg_dump) will do this for you.
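For anyone facing the same fixed-width .dat files, here is a minimal sketch of the conversion that suggestion #2 would make unnecessary (my actual script was Perl; this is Python, and the field widths are hypothetical, since the real ones come from the published user’s guide):

```python
# Hypothetical field widths for a ZIP-code table: ZIP (5), city (28), state (2).
FIELD_WIDTHS = [5, 28, 2]

def fixed_width_to_tsv(line, widths=FIELD_WIDTHS):
    """Slice one fixed-width record into fields and join them with tabs,
    ready for a bulk loader such as Postgres COPY."""
    fields, pos = [], 0
    for w in widths:
        fields.append(line[pos:pos + w].strip())
        pos += w
    return "\t".join(fields)

# Example record: ZIP padded to 5, city padded to 28, 2-char state
record = "60510" + "Batavia".ljust(28) + "IL"
print(fixed_width_to_tsv(record))  # 60510<TAB>Batavia<TAB>IL
```

The whole point of the complaint above is that the widths are the fragile part: if the vendor’s .sql file disagrees with the actual record layout, slicing silently produces garbage.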
Now that I have written this script, I would be happy to write another one for Melissadata or anyone else who would like to import this data into their database. If you would like me to write you one, let me know.
I will be attending the Where2.0 conference this year. I am really looking forward to it, as this will be my first time ever visiting California. Right now I am trying to figure out where I will stay while I am out there, so I put together a Google MyMap to help me decide. The obvious choice is the Fairmont Hotel, since that is where the conference is, but the room rates are a bit steep. I hope that this map helps somebody else besides me.
Today I installed OpenGTS (Open GPS Tracking System). Essentially, this is a JSP web application that collects tracking information from GPS-enabled devices and displays reports and the current location of each device on a Google map. The system can support multiple accounts with multiple users per account. I haven’t actually entered any tracking information into the system yet, but it seems that this part of the application is pretty complete. Where the application is lacking is in device and account management: as far as I know, the only way to administer these is through shell scripts that interact with the database.
Overall, the software was not terrible to install. The hardest part was making sure I had all the various Java packages required to build the .war files. Every time you want to make a configuration change, however, you must rebuild the main .war file. Now that it is all installed, it shouldn’t be too hard to maintain.
There are some enhancements that might make the system better. I would like to see all administration of accounts and devices happen through the UI. I would also like to see some way to integrate a billing system, so that people could sign themselves up and add devices, and you, the owner of the service, could bill for use of the system.
My next step is to actually generate some track logs and see how the reports look and what the Google map looks like.
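To generate test data without a real device, something like this sketch would do. The (timestamp, lat, lon, speed) layout is my own assumption for illustration, not the input format OpenGTS expects:

```python
import datetime
import random

def make_track(start_lat, start_lon, points=10, step=0.0005, seed=42):
    """Generate a plausible-looking track: a slightly wandering
    northeast-bound path with one fix every 30 seconds."""
    rng = random.Random(seed)  # seeded so runs are repeatable
    t = datetime.datetime(2007, 11, 1, 12, 0, 0)
    lat, lon = start_lat, start_lon
    rows = []
    for _ in range(points):
        rows.append((t.isoformat(), round(lat, 6), round(lon, 6),
                     rng.randint(25, 45)))  # speed in mph
        lat += step
        lon += step * rng.uniform(0.5, 1.5)  # wander a bit in longitude
        t += datetime.timedelta(seconds=30)
    return rows
```

Feeding a few of these synthetic tracks in should be enough to exercise the reports and the map display before any real hardware is involved.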
The Google Maps team released the MyMaps application today. There is a good writeup of how to use it on the Google Earth Blog. I went ahead and created my own map just to try it out. This seems like a good effort by Google to get people to generate content for Google. Perhaps they are trying to build a POI (points of interest) database. I personally think that this is a direct attack on services like Platial, with Google trying to keep eyes on its own site so it can deliver ads itself instead of making AdSense payments.
I think this is a nice little solution for exchanging maps with friends, but beyond that it doesn’t do much. There is a limit of 50 markers per map, for example, so if you have a lot of items to put on the map, this solution is not for you. You also cannot embed the map in another application. So right now this is a neat little trick that may help people share maps, but I don’t think it will go anywhere else. Most companies will want to control their brand, so they will not want to link off to another site that they do not control.
I don’t remember anyone posting about this, but it seems that Google is playing in the real estate market. I was doing a quick Google search and ran across a toolbar in my results that allowed me to specify a location as well as criteria for properties I am interested in. It seems that this service is pulling listings from at least one of the local MLS services, as I see a lot of the same properties I would see on a site like Realtor.com. Also, through Google Base I can add my own listings if I am a real estate professional. One last nice thing: I can create an RSS feed of my real estate listing search, so I can monitor new properties through my favorite RSS reader. All this, coupled with Google Maps, makes for a nice (although still not perfect) way to use the internet to search for real estate. I think sites like Realtor.com should be worried.
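Monitoring such a feed programmatically is straightforward. Here is a sketch that pulls listing titles and links out of an RSS 2.0 feed; the sample XML is made up, and a real feed would be fetched over HTTP rather than embedded as a string:

```python
import xml.etree.ElementTree as ET

# Made-up sample of what a saved-search RSS feed might look like
SAMPLE = """<rss version="2.0"><channel>
<item><title>3BR ranch, Batavia IL</title><link>http://example.com/1</link></item>
<item><title>2BR condo, St. Charles IL</title><link>http://example.com/2</link></item>
</channel></rss>"""

def listings(feed_xml):
    """Return (title, link) pairs for every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in listings(SAMPLE):
    print(title, "->", link)
```

Diffing the returned pairs against the previous run is all it takes to flag new properties as they hit the feed.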