Dreams, Resilience and Making a Difference

Noreen Whysel’s address to the 2017 Initiate Class of the Beta Phi Mu Theta honor society at Pratt Institute School of Information given on May 17, 2017. Slides are available at Google Slides.

Thanks to Dean Tula Giannini, Vinette Thomas, Beta Phi Mu initiates, graduates and guests. And especially to Karen Erani for inviting me to speak today. This is an honor.

Today I am going to talk about Dreams, Resilience and Making a Difference. Our goal as we embark on this journey is to make a difference. Whether we leave Pratt to become a school librarian, a legal or medical librarian, a UX designer, or an archivist, we do so to serve the information needs of some group of people.

We came with our dreams of what that life would be like. We study, we make sense of all the messes. (I think I see every class I took here in this picture)… and we deliver a neatly organized and usable semblance of the information our users and patrons need.

But between our dreams and our goal of making a difference is resilience. Resilience is a quality that allows us to cope with whatever the world throws at us. And because we stand between the deluge of information and the people we serve, sometimes it can feel like this [photo of lone house after a flood]. We hope to be strong like this house, built to survive the floodwaters of Hurricane Ike in September 2008. We don’t expect to face this exact scenario of course….


Usually, it’s a smaller disaster, a mess that you wish was neater…Even this [photo of moldy files] is probably more than most of us will ever deal with, but we studied to learn the frameworks for sorting through whatever is thrown at us, and we persevere.

So let’s unpack this. Resilience. It’s the capacity to recover quickly from difficulties; toughness. In materials science, it’s elasticity or the ability of a substance or object to spring back into shape. We call this “bouncing back” for a reason.

So, when I told Karen that I wanted to talk about resilience, I didn’t know that was also going to be the subject of Adam Grant’s address to the graduating class at Utah State University last weekend. I guess it’s a common theme.

You may know that Adam Grant is a professor of organizational psychology at the Wharton School at the University of Pennsylvania, and that he recently published a book, Option B, with Facebook’s Sheryl Sandberg on the topic of resilience. Grant’s speech reviewed typical topics for commencement addresses and boiled them down to three virtues: generosity, authenticity, and grit, for which resilience is the key component.

  • Being generous on the days when you lose faith in humanity
  • Staying true to yourself on the days when others lose faith in you
  • Persevering on the days when you lose faith in yourself

In Grant’s words, too much of any of these three qualities diminishes your ability to bounce back from adversities. We may think that grit resembles resilience the most. Toughness and an ability to persevere can get you through trouble, but go too far and you are no longer able to help others or align your actions with the dreams that make you who you are. If you are too tough you can’t bounce back. If you are too generous, you may lose yourself.

I had trouble with the idea of being too authentic, but maybe it has to do with holding too tightly to ways that have worked in the past but may not be helpful in the current situation. We’ll get back to this. But enough of Adam. Back to my talk.

Resilience is the quality that lets you follow your dreams so you can make a difference. It’s more than grit (and this is where my presentation departs a bit from Adam Grant’s). To practice resilience, you need to have an action plan for when things don’t go your way and another plan for mitigating the bad things that do inevitably happen [See NYC’s Ready New York Guides]. This is essential practice in emergency management, which is an area I have studied for many years, predating my time at Pratt.

If you are safe, whether that means financially or physically secure, you are in a better place to help others. If you are mindful, you can understand where your needs and capabilities fit into a given situation, and where you don’t, or where you may need to ask for help. And with a solid plan, you have a framework for doing your best even if it is something you haven’t done before or aren’t sure you are up to.

Security and planning play the same roles in institutional resilience. Mindfulness has a counterpart there too, but it manifests as a kind of transparency and situational awareness shared throughout the team and the partners dealing with an incident. Emergency responders call this a COP, or Common Operating Picture. It’s a playbook that everyone knows by heart and that can be augmented by information technologies.

I came to Pratt for guidance on the frameworks that help to sort information, particularly about the resilience of Cities, because I, along with many other GIS people who had volunteered at the 9/11 rescue and recovery, had a dream to ensure that the work of those who mapped the disaster would be preserved and understood as a component of our city’s core resilience.

9/11 was a difficult experience to go through—I don’t know how many of you were in NY at the time. But while it was unique in its own way, disasters of its magnitude are not uncommon here in the US and worldwide, whether man-made disasters like 9/11, natural disasters like Hurricane Katrina’s devastation of the Gulf Coast in 2005, or a combination of the two, as in the Tohoku tsunami that led to a nuclear meltdown in Fukushima, Japan in 2011. Preparation for an emergency event begins with gathering resources, mapping them, and ensuring that the action plan is delivered to the right people.

[Photo: fireman covered in soot]

What happened next was a kind of mass volunteer mobilization that could never have happened by the book. I was part of a GIS user group called GISMO, which had been working slowly, and not particularly successfully, to get city agencies to exchange maps and underlying data. Unfortunately, we weren’t really prepared for devastation of this magnitude. But we had some hope and some really smart people who were already figuring these things out.

The first meetings in response to the WTC attacks took place at the Department of Environmental Protection, which had responsibility for the water, sewer and air quality systems throughout the city, systems that were particularly vulnerable. It soon became clear that a larger space would be needed to produce the maps and information required by emergency response teams. The Emergency Mapping and Data Center, or EMDC, was established on Pier 92 on the Hudson River and served as a headquarters for the rescue, recovery and mitigation efforts of city, federal and military teams.

These initial efforts and the partnerships that arose out of the EMDC formed what would become policies, toolsets and a “common operating picture” that would prepare the City for future incidents requiring collaboration among many different agencies and partners.

Innovations in response processes, tools and equipment have been documented and were presented at a ten-year retrospective held at the Technology in Government conference in 2011, called the NYC GeoSymposium 2001-2011-2021.

This is a poster I created for the symposium outlining ten years of incidents reported by the Office of Emergency Management:

[Poster: OEM ten-year incident timeline]

Here is a detail from my ArcGIS Explorer presentation:

[Image: detail from the WTC ArcGIS Explorer presentation]

Some of the tools and artifacts that were created include updates to the very first citywide basemap, to be called NYCMAP. This map, first shot in 1999, combined aerial photography with street and building data to give a bird’s eye view of the City and its surface infrastructure. NYCMAP has developed into many versions of publicly accessible maps that are now available on the City Planning department’s website. For example, the Hurricane Evacuation Zone Finder was created in 2006 in response to Hurricane Katrina. During Hurricane Irene in 2011, WNYC.org and The NY Times created their own versions of maps that users could update with their own conditions reports.

After the WTC attack, a new Office of Emergency Management was built in Brooklyn, away from City Offices but with quick access to downtown Manhattan via the Brooklyn Bridge. It was originally created as an office of the Mayor but has since become a fully fledged Emergency Management department. Here is the floor plan of New York City’s Emergency Operations Center, located in Brooklyn near Cadman Plaza north of the courthouses.

[Floor plan: OEM Emergency Operations Center]

It gives a sense of how various response partners are organized on site. It’s sort of a physical information architecture. During a large-scale event, including weather events, multiple agencies are on hand to inform and take guidance from Emergency Management. Agencies are grouped by type of service, with GIS at the “prow” and Admin/Logistics in back, and with public-facing groups (left) and infrastructure groups (right) flanking the Command Station. This space is used during active incidents. The Watch Command Center is operational at all times.

Here are some photographs of what these facilities look like:

[Photos: OEM facilities]

This is what a command center looks like at individual departments like FDNY:

[Photo: FDNY command center]

One result of allowing a large-scale volunteer collaboration like the one we had at Pier 92, together with a convergence of new technology and crowd-sourcing solutions (and perhaps also the huge economic hit 9/11 dealt our City), was an increase in data transparency and citizen participation.

This included open data initiatives at every level of government from federal to local, app contests, hackathons, and growing participation from citizen mappers and data scientists. This year’s BigApps Contest will present its Finalist Expo and Awards Ceremony at Civic Hall on May 23. Go to Bigapps.nyc for tickets. They will run out quickly.

Notify NYC was another effort to inform citizens of localized incidents, via phone, web, email and SMS. Staffed by OEM Watch Commanders, Notify NYC is also available via Twitter & RSS. Multichannel public communications, including social media, allow citizens to connect with government agencies, report nuisances like rats and electric outages and access emergency preparedness resources.

So back to the dream my colleagues at GISMO and I had about creating a center of geospatial information. It’s becoming a reality.

The Center for Geospatial Innovation has been created with funding from the Fund for the City of New York. Alan Leidner, former GIS Director and Assistant Commissioner of the NYC Department of Information Technology and Telecommunications, is the director. A 9/11 Geospatial Archive is a key project, along with the Coalition of Geospatial Information Technology Organizations, or COGITO, which I am coordinating with additional funding from FCNY. We have collected over 650 digital and physical items including videos, maps and electronic geospatial data, as well as all of the presentations from the 2011 NYC GeoSymposium and other events.

Here are examples of some of the materials we have collected.

  • Maps of Restriction zones and affected facilities prepared by the FDNY and geographers at the Emergency Mapping Center at Pier 92.
  • Aerial photos.
  • LIDAR images showing the extent of damage. (These were created at the Emergency Mapping and Data Center on September 17).
  • We also have heat maps showing the extent of the fires burning beneath the rubble.
  • Maps showing the Structural Status of buildings in the vicinity of the attack. (These were created on September 21, 2001 by Urban Data Solutions, a commercial partner).
  • Maps of recovered personal objects and human remains.
  • We also have a large number of photographs of activity at the Emergency Mapping and Data Center at Pier 92. [Alan Leidner is in the white shirt and beard over here on the left].

[Photos: the Emergency Mapping and Data Center at Pier 92]

We have been able to collect names of geographers who participated in rescue and recovery from sign-in sheets, meeting notes and other documentation. LinkedIn has been a great way to find out where people who participated then are now, so we can interview them to discover additional artifacts that may be hidden in personal or official collections. We also have video interviews from the week following 9/11 identifying participants.

Handwritten notes and sign-in sheets from Department of Environmental Protection

An interesting document outlines the chronology of activities from September 11 to October 12. It contains information about participants and lessons learned in the weeks following the attack. This is resilience in action: the document was created in the midst of a crisis, yet it informs policies and planning for future events. The chronology also lists participating agencies, vendors and volunteers.

In addition to the archive, the Center for Geospatial Innovation is developing outreach to GIS and geospatial-oriented groups to advise on research and development activities. COGITO, the NYC Coalition of Geospatial Information and Technology Organizations, comprises leaders of several NYC-area and regional GIS groups. It serves as the center of an organized geospatial ecosystem in NYC and is developing activities to keep its constituent members informed of GIS opportunities, education and resources in the region.

COGITO participants include local and national GIS associations, Meetup and affinity groups, as well as university spatial data and visualization labs, including Pratt SAVI, Hunter College, CUNY Graduate School, Columbia and others. We also work with GIS offices throughout New York State to report on tools and processes that can build resilience in other local areas.

The vision for the Center for Geospatial Innovation is a City that has the ability to bounce back (resilience) through collaboration, communication and transparency, to meet challenges like climate change, “bad actors,” or anything else that comes our way; and to recognize the historical importance and value of those who participated in creating the systems that make our City resilient. We welcome you to participate and learn about the geospatial tools that support our City’s ability to return from adversities stronger and better prepared.

Thanks again for the opportunity to present and congratulations graduates!

If you would like to learn more about COGITO or the 9/11 Geospatial Archive, or if you have materials or stories that may be of interest to our future researchers and partners, please feel free to contact me.

Correction: The Center for Geospatial Innovation was referenced as the NYC Geospatial Technology Center in the original talk. Center for Geospatial Innovation is the correct name of the institution.

The Best User Experience Education Programs in NYC

I answer questions about UX, Information Architecture and other topics on Quora. A selection of these answers will be reposted on Medium with occasional, minor editing for clarity.

What are the best User Experience education programs based in New York? Are there any college accredited programs?

NYU offers a degree through its Interactive Telecommunications Program (ITP), which I believe is one of the oldest, if not the oldest, Masters programs in the field.

School of Visual Arts has a highly regarded MFA program in Interaction Design. The founder Liz Danzico is now head of UX at NPR.

Pratt School of Information just started offering a masters in Information Experience Design (M.S.). I took the courses for a UX concentration for my MSLIS degree before the program was formalized in 2015. The teaching staff is quite good, with many industry leaders in adjunct roles. Pratt also offers a design BFA and MFA that includes UX.

Parsons School of Design at the New School offers undergraduate and graduate programs that cover UX but are along more traditional disciplines (communications design, product design, industrial design, design and technology, transdisciplinary design). The idea is that UX is necessary within all these areas.

There are several professional programs that offer certificate or boot camp courses. Recruiters are mixed on their value, but they are good options for continuing education or for transitioning into the field. They are not highly regarded as a replacement for a BA, though they could be an alternative to a Masters degree.

An alternative might be to enter a traditional Library and Information Science program. There are a few accredited LIS programs in the NYC area, including Pratt Institute, Queens College, St. John's University and Long Island University in New York City and Rutgers University in New Brunswick, New Jersey. These programs teach the fundamentals of knowledge organization, information seeking behavior, taxonomic theory and related technologies, and have a growing interest in providing courses in user experience topics.

Site Maps and Variations: A Real World Review

A sitemap is a useful tool for both users and designers of a web interface. Designers use sitemaps to visualize website navigation, while users rely on them, in combination with search and browsing, to help them find what they are looking for. A third type of user, the search engine, relies on sitemaps in part to index websites. Increasingly, designers are integrating sitemaps and subsets of sitemaps directly into website navigation. For this report, I reviewed the literature on sitemaps to see how they have evolved as a tool for design and navigation and to explore new uses and variations.

Background and Definitions

Much of the earliest literature on sitemaps describes them as a tool to help users navigate a website. A sitemap in this context is a page on a website that displays an organized list of links to every page on the site. In the 2002 edition of The Elements of User Experience, Jesse James Garrett describes sitemaps as tools designers use to develop the structure of a website.

Sitemaps are also used as tools for design. Dan Brown (2011, p. 94) defines sitemaps as “A visual representation of the relationships between different pages on a web site. Also known as a structural model, taxonomy, hierarchy, navigation model, or site structure.” According to Brown, website designers, developers, project managers and stakeholders use site maps to:

  • Clarify information hierarchy,
  • Establish a navigational backbone, and
  • Facilitate content migrations.


Because of the dual use of sitemaps as a tool for design and as a navigational feature of a website, Jesse James Garrett recommends referring to a sitemap, the design tool, as an architectural diagram (Garrett, 2002, p. 17). Peter Morville and Louis Rosenfeld (2002) prefer the term blueprint, which also borrows from the terminology of physical architecture, but can be confused with other information architecture tools, such as wireframes, which employ the conventions of architectural blueprints, such as monochrome color scheme and annotation. In referring to the design tool, I prefer the term sitemap, because it conveys a sense of the orientation and context of site content, rather than instructions for building a site. 

Sitemaps as a Design Tool

Designers use sitemaps to develop a new site structure, create a current view of website content and to analyze gaps in content or problems in the arrangement of content (Brown, p. 95). Sitemaps are often created using a drawing tool such as Microsoft Visio or PowerPoint, but can also be represented in Excel. Drawing tools allow you to map relationships between pages by grouping boxes and connecting them with lines, much like an organizational chart. The following sitemap is the result of importing an Excel file into Visio:


Figure 1: Sitemap from The Lazy IA’s Guide to Making Sitemaps (Turbek, 2006).

The presentation of a sitemap as a series of hierarchical boxes with interconnecting lines is useful in diagramming the relationships between content on a section of a website or within an entire website. These diagrams are rarely seen by website users, though the resulting structure informs the site navigational structure of the website and may be represented in a page on the website itself.
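The hierarchy described above can also be represented directly in code. Below is a minimal sketch (in Python, with hypothetical page names, not drawn from any real site) that models a sitemap as a nested dictionary and prints the kind of indented outline a text-only sitemap page presents:

```python
# Minimal sketch: a sitemap hierarchy as a nested dict.
# Page names here are hypothetical placeholders.
sitemap = {
    "Home": {
        "About": {"Team": {}, "History": {}},
        "Products": {"Widgets": {}, "Gadgets": {}},
        "Contact": {},
    }
}

def outline(tree, depth=0):
    """Yield one indented line per page, like a text-only sitemap page."""
    for page, children in tree.items():
        yield "  " * depth + page
        yield from outline(children, depth + 1)

for line in outline(sitemap):
    print(line)
```

The same structure could be exported to a drawing tool such as Visio, or compared against a crawl of the live site to spot pages missing from the intended hierarchy.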

Sitemaps as a Navigation Tool

From a user’s perspective, “A site map’s main benefit is to give users an overview of the site’s areas in a single glance by dedicating an entire page to a visualization of the information architecture” (Nielsen, 2002). We think of user experience as a somewhat recent development in the design of interactive products. However, in an even earlier article about sitemaps, Peter van Dijck (1999) stresses that the user experience is paramount when developing a sitemap.

Van Dijck says that because users are looking for something specific on a website, the sitemap should be complete so they can find it. Second, since users are on a website to find information that is enjoyable and of interest, the sitemap should also include meta-information to help them decide whether the effort of clicking a link is worth it. These are two requirements for sitemaps: completeness and meta-information. He suggests grouping like pages together, highlighting the most popular pages in bold, or color coding links. Most sitemaps follow this concept by grouping pages by top-level navigation and utility (or meta) navigation. Some also group links into facets in addition to the more traditional sections, but the goal is to present a visualization of every page on the website in one place, which is useful as a secondary navigation tool.

A 2002 study by Jakob Nielsen provided a number of suggestions for effective sitemaps. It recommended that websites include a link to the sitemap on every page, clearly labeled “site map.” It also indicated that sitemaps should be relatively short, no more than two and a half times the window size, and that multiple layers of hierarchy are acceptable if designed well, i.e., simple and compact. It also suggested that interactive sitemaps can be confusing and should be avoided.

As for level of detail, Garrett (2002) states that a sitemap on a website should only show one to two levels of the site, rather than a complete map of every page. Nielsen, however, says a complete map is best for small sites, but that several levels can be displayed if the sitemap is designed well. Search engine optimization expert, Shari Thurow, recommends having a complete sitemap to improve accessibility by search engines (S. Thurow, personal communication, November 29, 2013).

Nielsen also concluded, “…users are reluctant to use site maps and sometimes have problems even finding them.” In the study, only 27% of users turned to sitemaps to find information. Nielsen found that less than half of the sites included in his book, Homepage Usability, had sitemaps at all. Nielsen inferred that perhaps users didn’t expect a site to have a sitemap and simply didn’t look for one.

In a 2008 follow-up to his first sitemap study, Nielsen reported that sitemaps remain useful as a secondary navigation aid, despite the fact that sitemap usage fell to only 7% of users in the later study versus 27% in 2002, because “they’re the only feature that gives users a true overview of everything on a site.” He justified sitemaps because they don’t hurt people who don’t use them, they help those who do use them, and they are relatively inexpensive to produce (Nielsen, 2008).

Sitemap Alternatives

Nielsen notes, “Users go to site maps if they are lost, frustrated, or looking for specific details on a crowded site” (Nielsen, 2002). He likes sitemaps because they offer a clean, simple website overview and represent an alternative finding aid in what may be a cluttered website. I thought it might be interesting to look at some of the alternative variations on sitemaps that help users find content. It turns out that Nielsen has much to say about these alternatives as well.

Mega Drop-Down Menus

Nielsen (2009) refers to a study of “mega drop-down menus,” which are navigational features, often integrated with global navigation. Mega drop-down menus, also described as “fat menus,” are ways for site designers to include subpage navigation by segmenting content into groupings or popular links.

According to Nielsen, mega drop-down menus have the following characteristics:

  • Large, two-dimensional panels with options divided into groups
  • Choices are structured through layout and typography, and may use icons
  • Everything is visible at the same time without scrolling
  • Content is displayed in a vertical or horizontal form (drop-down for horizontal global navigation and fly-out for left navigation bars).


Mega drop-down menus allow the addition of featured content as a way to promote or market specific pages within the selected section. The Wall Street Journal includes featured content, such as blogs, videos and magazine content in their mega drop-down menus:


Figure 2 Wall Street Journal Mega Drop-Down Menu

Nielsen’s 2009 study concludes that Mega drop-downs work because “…they show everything at a glance, so users can see rather than try to remember.” This goes back to Nielsen’s earlier work on usability that recommends interfaces that rely on recognition rather than recall (Nielsen, 1995).

Why else do mega drop-down menus work? First, typical drop-down menus often hide options. Mega drop-down menus contain all of the user’s options for a particular section in a single panel, so users do not need to rely on memory. Also, regular drop-down menus don’t support groupings, while a mega drop-down’s card or panel layout does. Finally, mega drop-down menus make it easy to include rich typography, images and illustrations.

Retail sites often also include fat menus with links to sales, promotional features and facets like “Gifts for Her,” which drive sales toward specific pages, as in the following fat menu from Bodenusa.com:


Figure 3 Boden Mega Drop-Down Menu

Applying a sitemap as a tool for design, an information architect can look at a shopper’s mental model and user scenarios when shopping within a particular category, and suggest facets and features to include in fat menus. The resulting menu design can then be tested on actual users to determine if the links are appropriate and if they are likely to drive sales.

Doormat Menus

Similar to the mega drop-down menu is the “doormat navigation” design pattern described in the Nokia Developers website and Welie.com. Both display a panel of organized links to content, except the doormat navigation is typically only displayed on the home page, thus the “doormat” reference, and often displays each top-level navigation item in the same panel with sub-groupings of links, rather than focusing on a single top-level section (Welie.com).

Welie.com’s description of doormat navigation suggests that the panel contains sub-section links for each major section, while the Nokia Developers site shows doormat navigation images that are specific to a single site section (Mobile web design: doormat navigation, n.d.). Both Welie.com and Nokia note that doormat navigation is best placed on the home page of large websites with a lot of content, while Nielsen assumes mega drop-down menus would be accessible from every page of the website.

Doormat navigation may be falling out of favor or evolving toward the mega drop-down menu format. Welie.com displays an example of a doormat navigation from Adobe.com; however, Adobe’s current website features mega drop-down menus.

Footer Sitemaps

Van Dijck’s 1999 article is also a very early example of placing the sitemap in the footer of every page. An example of this method is presented on the travel site Poor But Happy: Colombia. In fact, footer sitemaps, also called “fat footers,” do not appear to have been common before 2009, based on a scan of the literature on the topic.


Figure 4 Poor But Happy: Colombia Sitemap Footer (van Dijck, 2000).

In contrast to van Dijck’s sitemap footer, recent fat footers typically present an abbreviated list of links to major sub-pages of each top-level section, rather than a complete sitemap. Below is a fat footer from the Wall Street Journal:


Figure 5 Wall Street Journal Footer

In Jakob Nielsen’s 2012 article on “SEO and Usability,” he cautions against attempting to stuff an entire website into fat footers or menus: “Carefully curated footer content can strengthen structural SEO by guiding link juice to the site’s best pages about each key topic” (Nielsen, 2012). This needs to be balanced by keeping the information compact and scannable enough to avoid confusing the user.

I interviewed search engine optimization expert, Shari Thurow, to explore further the effect of fat menus and footers on search engine optimization and findability. She agrees with Nielsen. Thurow said, “Too many links on a page, unless it is a page where humans expect a lot of links (site map/index, top-level category page, View all page, etc.) actually impedes findability. So mega menus should not be too large, ironically” (S. Thurow, personal communication, November 29, 2013). Having a complete sitemap on every page, as van Dijck recommended, may dilute the effectiveness of the search engine optimization strategy as well.

Thurow notes that it is also important that menus be coded in a way that is crawlable by search engines and visible to screen readers; not all menu scripts are accessible. She further recommends that a sitemap page include not only organized links to all site content, but also annotations that provide “…keyword-rich content for the search engines to spider” (Thurow & Musica, 2009, p. 118). Additionally, Thurow suggests that fat menus and sitemaps by themselves are not enough. She notes that “…wayfinder pages, like a Brands A-Z page, are really helpful for both humans and search engines” (S. Thurow, personal communication, November 29, 2013).

XML Sitemaps

Exposing library catalogs and other so-called deep Web content to search engine spiders is a challenge. Vinit Kumar of the Institute of Library and Information Science, Bundelkhand University, experimented with developing an XML sitemap that can be read by search engines to expose OPAC content (Kumar, 2012). He presents two ways to create these sitemaps: locally run, hand-coded routines or remote sitemap generation services (pp. 495-496). Some local routines include GSiteCrawler (http://gsitecrawler.com/), G-Mapper (http://www.dbnetsolutions.co.uk/gmapper/), and WebDesignPros Sitemap Generator (http://www.webdesignpros.ca).

Remote services include Google Webmaster tools and search engine-specific tools that access a website’s sitemap page. Kumar notes that for most of the remote services, the OPAC data needs to be publicly available in order to be spidered. If the OPAC data is obscure, then additional coding or a local option may be necessary.
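The heart of a locally run routine is small. The sketch below (Python, with hypothetical catalog URLs) emits XML in the sitemaps.org `<urlset>` format that search engines read; a real OPAC routine would substitute records pulled from the catalog database:

```python
# Minimal sketch of a locally run XML sitemap generator, emitting
# the sitemaps.org <urlset> format. URLs are hypothetical placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: iterable of (url, last-modified date) pairs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    ("http://example.org/catalog/record/1", "2013-11-29"),
    ("http://example.org/catalog/record/2", "2013-11-30"),
])
print(xml)
```

The resulting file is then placed at the site root (or registered through a webmaster tool) so spiders can discover catalog records they would otherwise never reach by following links.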

XML sitemaps also provide the benefit of encoding semantic web metadata, which offer another tool for the information architect to express meaning within a sitemap. Further research is required to test how XML sitemaps may offer benefits such as improved search and discovery.

The Future of Sitemaps

According to Dan Brown, sitemaps continue to be useful in representing content for structuring websites, despite the increasing complexity of the Web (Brown, 2011, p. 122). Sitemaps are effective tools for conceiving the overall structure of a website, but are limited in their ability to depict dynamic and user generated content accurately.

Victor Lombardi, a founder of the IA Institute, recently started a thread on the iai-members discussion list about how to create a sitemap for data (Lombardi, 2013). Dorian Taylor replied with a suggestion to use Gephi, a data visualization tool, to represent the content in a network graph. In an earlier blog post, Taylor presented an example of a Gephi visualization of the IA Institute’s website that he created using RDF data obtained from a crawler he designed (Taylor, 2011). The resulting graph presents a semantic diagram that, according to Taylor, precludes the need for sections because the necessary semantic relationships are already built into the RDF metadata for the links between pages.


Figure 6: Gephi Sitemap Diagram of IAInstitute.org

A sitemap rendered as a network graph is an interesting idea, but it would require user testing to determine whether website visitors find it easy to use. As noted above, Nielsen (2002) suggests that interactive maps should be avoided because they may confuse users. On the other hand, this type of visualization is excellent for analyzing gaps in content and natural groupings of concepts and information. That such a sitemap can be generated on the fly makes it an interesting alternative for mapping dynamic websites, one that merits further exploration.
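The analytical uses just mentioned can be approximated even without RDF tooling or Gephi: treat pages as nodes and links as edges, then inspect the graph for orphan pages (content gaps) and connected clusters (natural groupings). A standard-library sketch, using a small hypothetical link table in place of a real crawl:

```python
# Sketch of a sitemap-as-graph analysis: pages are nodes, links are edges.
# The link table is hypothetical; in practice it would come from a crawler.
from collections import defaultdict

links = {
    "/": ["/about", "/events", "/library"],
    "/about": ["/"],
    "/events": ["/", "/events/2013"],
    "/events/2013": ["/events"],
    "/orphan-draft": [],          # no page links here: a content gap
}

def orphan_pages(links):
    """Pages that exist but receive no inbound links (excluding the root)."""
    targets = {t for outs in links.values() for t in outs}
    return sorted(p for p in links if p not in targets and p != "/")

def clusters(links):
    """Natural groupings: connected components of the undirected link graph."""
    neighbors = defaultdict(set)
    for page, outs in links.items():
        neighbors[page]  # ensure pages with no outbound links appear
        for t in outs:
            neighbors[page].add(t)
            neighbors[t].add(page)
    seen, groups = set(), []
    for start in neighbors:
        if start in seen:
            continue
        stack, group = [start], set()
        while stack:
            node = stack.pop()
            if node in group:
                continue
            group.add(node)
            stack.extend(neighbors[node] - group)
        seen |= group
        groups.append(sorted(group))
    return groups

print(orphan_pages(links))  # → ['/orphan-draft']
print(clusters(links))      # one interlinked cluster plus the orphan
```

A graph built this way can also be exported for rendering in Gephi, which accepts simple edge lists as input.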

Nielsen (2002) predicted that updated versions of browsers, such as Internet Explorer, might include tools that automatically produce a usable, interactive sitemap, though he poked fun at Microsoft for being slow to adopt feature requests. Kumar returns to the idea of built-in browser assistance, noting that libraries could develop a browser plug-in to support searching library OPACs. This would allow patrons to search catalog content from the browser’s search bar rather than directly in the OPAC. Another approach would be to code the search results so that catalog content appears first on the results page. Until such tools arrive, the designer must optimize website navigation rather than expect the browser to provide sitemap features.

Conclusion

Sitemaps are useful tools for website design, search and navigation. Because of the evolving nature and increasing complexity of the Web, the role of the sitemap is also evolving. The sitemap is an important aid to navigating complexity for all users, whether designers, stakeholders, or site visitors. As Brown (2011, p. 122) suggests in his conclusion to the sitemap chapter, this growing complexity makes sitemaps more important, not less so. With the increasing power of visualization and semantic web tools, we can anticipate interesting new uses of the sitemap as an analytical and navigational tool.

 

References

Brown, D. M. (2011). Communicating design: Developing web site documentation for design and planning. Berkeley, CA: New Riders.

Doormat navigation. Welie.com. Web. Retrieved on November 20, 2013 from http://www.welie.com/patterns/showPattern.php?patternID=doormat

Garrett, J. J. (2002). Elements of user experience. Indianapolis, IN: New Riders.

Garrett, J. J. (2011). The elements of user experience: User-centered design for the Web and beyond. Berkeley, CA: New Riders.

Kumar, V. (2012). Exposing library catalogues to search engines. DESIDOC Journal Of Library & Information Technology, 32(6), 493-497.

Mobile web design: Doormat navigation. Nokia Developer. Web. Retrieved on November 20, 2013 from http://developer.nokia.com/Community/Wiki/Mobile_Web_Design_:_Doormat_Navigation

Nielsen, J. (January 1, 1995). Ten usability heuristics. Web. Retrieved on November 29, 2013 from http://www.useit.com/papers/heuristic/heuristic_list.html

Nielsen, J. (January 6, 2002). Site map usability, 1st study. Web. Retrieved on November 29, 2013 from http://www.nngroup.com/articles/site-map-usability-1st-study/

Nielsen, J. (September 2, 2008). Site map usability. Web. Retrieved on November 29, 2013 from http://www.nngroup.com/articles/site-map-usability/

Nielsen, J. (March 23, 2009). Mega menus work well for site navigation. Web. Retrieved on November 29, 2013 from http://www.nngroup.com/articles/mega-menus-work-well/

Nielsen, J. (August 13, 2012). SEO and usability. Web. Retrieved on December 1, 2013 from http://www.nngroup.com/articles/seo-and-usability/

Nielsen, J., & Tahir, M. (2002). Homepage usability: 50 websites deconstructed. Indianapolis, IN: New Riders.

Rosenfeld, L., & Morville, P. (2002). Information architecture for the World Wide Web. Sebastopol, CA: O’Reilly.

Taylor, D. (2013, November 26). Site maps from data? Message posted to http://lists.iainstitute.org/private.cgi/iai-members-iainstitute.org/2013-November/010838.html

Taylor, D. (2011). Content Robo Inventory. Web. Retrieved on December 1, 2013 from http://doriantaylor.com/content-robo-inventory

Thurow, S., & Musica, N. (2009). When search meets web usability. Berkeley, CA: New Riders.

Turbeck, S. (January 30, 2006). The lazy IA’s guide to making sitemaps. Boxes and Arrows. Retrieved on December 1, 2013 from http://boxesandarrows.com/the-lazy-ias-guide-to-making-sitemaps/

van Dijck, P. (1999). The problem(s) with sitemaps. Web. Retrieved on November 29, 2013 from http://evolt.org/node/710/

van Dijck, P. (2000). Poor but happy: Colombia. Web. Retrieved on November 29, 2013 from http://web.archive.org/web/20001204205500/http://poorbuthappy.com/colombia/index.html