Slow Performance issue

Topics: Windows Forms
Feb 18, 2011 at 10:40 PM
Edited Feb 18, 2011 at 11:11 PM

I have a .CSV POI file which I download, parse, and convert into a collection of POI objects, about 2,000 of them (Starbucks locations all over the US).
All that works fine, until I start adding the objects to the layer as markers (including a MouseOver tooltip for each).

The adding process takes quite a while (a few minutes), after which the map's performance makes it unusable: any operation such as dragging the map, zooming, etc., takes 30-60 seconds...
My collection of POI objects is an array of objects... Do you think this is because it is an array (vs. a Hashtable, for example), or is it an intrinsic performance issue of GMap.NET due to the large number of markers?

Also, it seems that the app's memory consumption shot up to 1.2 GB of RAM from a "normal" 100-200 MB...

Coordinator
Feb 18, 2011 at 11:24 PM

Can you fix it? ;} ...anyway, why do you need so many markers, and what version do you use, WPF?

Feb 19, 2011 at 12:01 AM
Edited Feb 19, 2011 at 12:06 AM

Well, it's a mapping app, isn't it? So I use free POI files, such as Starbucks in the US, Shell gas stations, department stores, etc., as the source of data. They sometimes contain several thousand locations in the form Lat,Lng,Name,Address.
So I parse the .csv file into C# objects that I create and add to a Dictionary. Then I loop through the dictionary and, for every location, create a marker with a tooltip and add it to the layer...
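As a rough sketch of that parsing step (the `Poi` class, the key scheme, and the Lat,Lng,Name,Address field order are assumptions based on the description above, not actual code from this app):

```csharp
using System;
using System.Collections.Generic;
using System.Globalization;

// Hypothetical POI class matching the Lat,Lng,Name,Address layout described above.
public class Poi
{
    public double Lat;
    public double Lng;
    public string Name;
    public string Address;
}

public static class PoiParser
{
    // Parses one CSV line; returns null for malformed lines.
    public static Poi ParseLine(string line)
    {
        string[] parts = line.Split(',');
        if (parts.Length < 4)
            return null;

        double lat, lng;
        if (!double.TryParse(parts[0], NumberStyles.Float, CultureInfo.InvariantCulture, out lat) ||
            !double.TryParse(parts[1], NumberStyles.Float, CultureInfo.InvariantCulture, out lng))
            return null;

        return new Poi { Lat = lat, Lng = lng, Name = parts[2].Trim(), Address = parts[3].Trim() };
    }

    // Builds a dictionary keyed by name and coordinates, so exact duplicates collapse.
    public static Dictionary<string, Poi> Parse(IEnumerable<string> lines)
    {
        var result = new Dictionary<string, Poi>();
        foreach (string line in lines)
        {
            Poi p = ParseLine(line);
            if (p != null)
                result[p.Name + "@" + p.Lat + "," + p.Lng] = p;
        }
        return result;
    }
}
```

Using `CultureInfo.InvariantCulture` matters here: on locales that use a comma as the decimal separator, a plain `double.Parse` would silently misread the coordinates.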

Fix it? Well, I wish I knew at this point where this performance hit is coming from. It does not look like it is my processing, since even after the initial display I have issues with any operation on the map; for example, a tooltip is displayed after 30 seconds. In addition, I see some disk activity... could it be marker caching?

I use WinForms, NOT WPF...

I guess at this point I will start with 100 points and increment by 100 on every new run... How would you suggest detecting where the bottleneck is?

UPDATE: when performance is sometimes more acceptable, I am also getting a forced break in the GetImageFrom function:

            if(UseMemoryCache)
            {
               MemoryStream m = GetTileFromMemoryCache(new RawTile(type, pos, zoom));
               if(m != null)
               {
                  if(GMaps.Instance.ImageProxy != null)
                  {
                     ret = GMaps.Instance.ImageProxy.FromStream(m);
                     if(ret == null)
                     {
#if DEBUG
                        Debug.WriteLine("Image disposed in MemoryCache o.O, should never happen ;} " + new RawTile(type, pos, zoom));
                        if(Debugger.IsAttached)
                        {
                           Debugger.Break();
                        }
#endif

#if !PocketPC
                        m.Dispose();
#else
                        (m as IDisposable).Dispose();
#endif
                     }
                  }
               }
            }
Coordinator
Feb 19, 2011 at 11:17 AM

I see; what changeset do you use?

The issue is that the current version recalculates marker positions when you drag, so if you have 1,000 objects there will be 1,000 calculations for each pixel. I have another version in the optimization branch which doesn't do that, but simply changes the render offset without recalculation. But ;] it has some limitations, so I had to create a virtual center for every 100k px, because GDI/WPF use damn float precision and the tiles render overlapped, and it isn't complete yet ;/

Also, there is a floating idea about clustering the markers, so that only some portion of all markers is displayed for the current zoom level; zero progress on this so far ;}
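For what it's worth, a minimal grid-clustering sketch along those lines could look like this; the `GeoPoint` type, the cell-size formula, and the representative-plus-count scheme are all assumptions, not GMap.NET API:

```csharp
using System;
using System.Collections.Generic;

// Grid-clustering sketch: bucket points into cells whose size shrinks as the
// zoom level grows, then keep one representative point plus a member count
// per cell, so only one marker per cell needs to be drawn.
public struct GeoPoint
{
    public double Lat, Lng;
    public GeoPoint(double lat, double lng) { Lat = lat; Lng = lng; }
}

public static class GridCluster
{
    // Cell size in degrees halves with each zoom step; the base value is arbitrary.
    public static double CellSize(int zoom)
    {
        return 360.0 / Math.Pow(2, zoom);
    }

    // Returns one (representative, memberCount) pair per occupied cell.
    public static List<KeyValuePair<GeoPoint, int>> Cluster(IEnumerable<GeoPoint> points, int zoom)
    {
        double cell = CellSize(zoom);
        var buckets = new Dictionary<string, KeyValuePair<GeoPoint, int>>();
        foreach (GeoPoint p in points)
        {
            string key = Math.Floor(p.Lat / cell) + ":" + Math.Floor(p.Lng / cell);
            KeyValuePair<GeoPoint, int> existing;
            if (buckets.TryGetValue(key, out existing))
                buckets[key] = new KeyValuePair<GeoPoint, int>(existing.Key, existing.Value + 1);
            else
                buckets[key] = new KeyValuePair<GeoPoint, int>(p, 1);
        }
        return new List<KeyValuePair<GeoPoint, int>>(buckets.Values);
    }
}
```

At low zoom all 6,000 points would collapse into a handful of cluster markers, and zooming in progressively splits them apart.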

Feb 19, 2011 at 4:57 PM

I am not sure; the assembly version of the Core is 1.5.3.3.

I guess I am out of luck... My only consolation is that the problem is not in my code... So I am probably limited to a few hundred markers... The only solution in this case is for the user to first indicate the local area where he/she wants the data to show up...

Clustering markers sounds OK... I'd take the visible area, add the view's width/height on all four sides, and recalculate everything there; that is, recalculating only the visible area... although sometimes that could be quite a lot if it is zoomed out...
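The "visible area plus one view on each side" idea could be sketched like this, assuming a hypothetical `ViewBounds` type rather than GMap.NET's own rectangle type:

```csharp
// Padded-viewport filter sketch: keep only the markers whose coordinates fall
// inside the visible area extended by one view width/height on each side, so
// that panning by less than a screen never reveals an empty region.
public struct ViewBounds
{
    public double Top, Bottom, Left, Right; // lat/lng edges of the visible area

    // Grows the rectangle by its own height above/below and width left/right.
    public ViewBounds Padded()
    {
        double h = Top - Bottom, w = Right - Left;
        return new ViewBounds
        {
            Top = Top + h,
            Bottom = Bottom - h,
            Left = Left - w,
            Right = Right + w
        };
    }

    public bool Contains(double lat, double lng)
    {
        return lat <= Top && lat >= Bottom && lng >= Left && lng <= Right;
    }
}
```

Note this simple lat/lng box does not handle views that straddle the antimeridian; a real implementation would need to special-case that.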

Feb 27, 2011 at 5:04 PM

Although it's not a true answer, the way I get around the same problem is that I do a distance check against the current position and only show those markers within range.

I don't really need to know if there's a Starbucks 200 miles from my current position, just the ones within 10 miles (:
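That distance check can be done with the haversine formula; a self-contained sketch (not tied to any GMap.NET types, and the class/method names are made up for illustration):

```csharp
using System;

// Haversine great-circle distance, for a "show only markers within N miles
// of the current position" filter.
public static class NearbyFilter
{
    const double EarthRadiusKm = 6371.0;

    public static double DistanceKm(double lat1, double lng1, double lat2, double lng2)
    {
        double dLat = (lat2 - lat1) * Math.PI / 180.0;
        double dLng = (lng2 - lng1) * Math.PI / 180.0;
        double a = Math.Sin(dLat / 2) * Math.Sin(dLat / 2) +
                   Math.Cos(lat1 * Math.PI / 180.0) * Math.Cos(lat2 * Math.PI / 180.0) *
                   Math.Sin(dLng / 2) * Math.Sin(dLng / 2);
        return EarthRadiusKm * 2.0 * Math.Atan2(Math.Sqrt(a), Math.Sqrt(1.0 - a));
    }

    // True if the marker is within the given number of miles of the position.
    public static bool WithinMiles(double lat1, double lng1, double lat2, double lng2, double miles)
    {
        return DistanceKm(lat1, lng1, lat2, lng2) <= miles * 1.609344;
    }
}
```

Running `WithinMiles` over a few thousand points is cheap; the expensive part is still adding/removing the surviving markers on the layer, so it helps to clear and re-add only when the position changes significantly.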

Feb 27, 2011 at 10:31 PM
Edited Feb 27, 2011 at 10:33 PM

I agree that as a workaround it is OK (based on distance); however, it is not OK for scenarios where the user is not looking at the immediate vicinity but at the global picture...
In my particular case I am trying to display many different categories of info. So Starbucks is probably relevant to the immediate vicinity of the user, but if I am trying to display info for demographic purposes then I need to see all 6,000 points throughout the US... bummer!

Or say you are trying to plan a trip and want to display all resting areas for big RVs... or some other info throughout the US.
What bothers me is that it gets into Debugger.Break(): how would I know the number of points to limit myself to? Trial and error? What if some other machine has less memory?