There’s an interesting comment on Jared Spool’s Site Maps and Site Indexes, Revisited (a follow-up to What about Site Maps and Site Indexes?) suggesting that site maps might help search engines ‘easily spider your site’. While I agree that a site map might make a good seed/starting point for a search engine, I don’t quite see how any crawling engine is going to be taken seriously if it can’t cope with a sitemap-less site. A good starting point can be useful in a multi-domain crawl, assuming you only want one connection per domain at a time, because it lets you reach the maximum number of threads quickly, but this isn’t usually an issue with individual sites.
That said, I find sitemaps very useful for a search-engine-related reason. Sitemaps very often reflect the expected navigational search queries, so they provide a good set of test data if you want to evaluate your search engine’s performance on navigational queries. We do this sort of analysis quite often as part of customer demonstrations. Grab the search results for each of the sitemap link names, see where the page linked in the sitemap comes up, and then compare the results between search engines.
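To make the evaluation concrete, here’s a minimal sketch of that analysis, assuming the sitemap has already been reduced to (link name, target URL) pairs and that `search` is a hypothetical stand-in for whatever query API your engine exposes (the function names and the stubbed results are illustrative, not any real engine’s API):

```python
def navigational_rank(query, target_url, search, max_rank=10):
    """Return the 1-based rank of target_url in the results for query,
    or None if it doesn't appear in the top max_rank results."""
    results = search(query)[:max_rank]
    for rank, url in enumerate(results, start=1):
        if url == target_url:
            return rank
    return None

def evaluate_sitemap(entries, search):
    """entries: iterable of (link_name, target_url) pairs from the sitemap.
    Returns {link_name: rank_or_None}, suitable for comparing engines."""
    return {name: navigational_rank(name, url, search) for name, url in entries}

# Usage with a stubbed engine standing in for real search results:
sitemap = [("Contact Us", "/contact"), ("Pricing", "/pricing")]

def fake_search(query):
    # Canned results playing the role of a live search engine.
    canned = {
        "Contact Us": ["/about", "/contact", "/support"],
        "Pricing": ["/plans", "/pricing-faq", "/blog/pricing-update"],
    }
    return canned.get(query, [])

print(evaluate_sitemap(sitemap, fake_search))
# {'Contact Us': 2, 'Pricing': None}
```

Running the same `evaluate_sitemap` against two engines and comparing the rank dictionaries gives you exactly the side-by-side view described above: a `None` or a low rank flags a navigational query the engine is fumbling.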
Of course, navigational queries aren’t the only aspect of a search engine to consider, and this is unlikely to justify maintaining a sitemap for your site, but I for one will be disappointed if they lose popularity.
(And for reference, no, I wouldn’t often use a sitemap myself. I have occasionally resorted to one when looking for something I’m sure should exist after basic navigation and search have failed, but thankfully that’s not too often.)